[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 27712 1727096473.56082: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-And executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 27712 1727096473.57359: Added group all to inventory 27712 1727096473.57362: Added group ungrouped to inventory 27712 1727096473.57366: Group all now contains ungrouped 27712 1727096473.57773: Examining possible inventory source: /tmp/network-EuO/inventory.yml 27712 1727096473.89104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 27712 1727096473.89263: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 27712 1727096473.89289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 27712 1727096473.89344: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 27712 1727096473.89421: Loaded config def from plugin (inventory/script) 27712 1727096473.89423: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 27712 1727096473.89466: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 27712 1727096473.90062: Loaded config def from plugin (inventory/yaml) 27712 1727096473.90064: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 27712 1727096473.90153: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 27712 1727096473.91228: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 27712 1727096473.91232: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 27712 1727096473.91235: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 27712 1727096473.91240: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 27712 1727096473.91245: Loading data from /tmp/network-EuO/inventory.yml 27712 1727096473.91316: /tmp/network-EuO/inventory.yml was not parsable by auto 27712 1727096473.91586: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 27712 1727096473.91625: Loading data from /tmp/network-EuO/inventory.yml 27712 1727096473.91713: group all already in inventory 27712 1727096473.91720: set inventory_file for managed_node1 27712 1727096473.91724: set inventory_dir for managed_node1 27712 1727096473.91725: Added host managed_node1 to inventory 27712 1727096473.91728: Added host managed_node1 to group all 27712 1727096473.91729: set ansible_host for managed_node1 27712 1727096473.91729: 
set ansible_ssh_extra_args for managed_node1 27712 1727096473.91733: set inventory_file for managed_node2 27712 1727096473.91736: set inventory_dir for managed_node2 27712 1727096473.91736: Added host managed_node2 to inventory 27712 1727096473.91738: Added host managed_node2 to group all 27712 1727096473.91739: set ansible_host for managed_node2 27712 1727096473.91740: set ansible_ssh_extra_args for managed_node2 27712 1727096473.91742: set inventory_file for managed_node3 27712 1727096473.91744: set inventory_dir for managed_node3 27712 1727096473.91745: Added host managed_node3 to inventory 27712 1727096473.91746: Added host managed_node3 to group all 27712 1727096473.91747: set ansible_host for managed_node3 27712 1727096473.91748: set ansible_ssh_extra_args for managed_node3 27712 1727096473.91750: Reconcile groups and hosts in inventory. 27712 1727096473.91754: Group ungrouped now contains managed_node1 27712 1727096473.91756: Group ungrouped now contains managed_node2 27712 1727096473.91757: Group ungrouped now contains managed_node3 27712 1727096473.92041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 27712 1727096473.92167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 27712 1727096473.92422: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 27712 1727096473.92450: Loaded config def from plugin (vars/host_group_vars) 27712 1727096473.92452: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 27712 1727096473.92459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 27712 1727096473.92469: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 27712 1727096473.92514: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 27712 1727096473.93245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096473.93343: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 27712 1727096473.93590: Loaded config def from plugin (connection/local) 27712 1727096473.93594: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 27712 1727096473.94885: Loaded config def from plugin (connection/paramiko_ssh) 27712 1727096473.94889: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 27712 1727096473.96522: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27712 1727096473.96563: Loaded config def from plugin (connection/psrp) 27712 1727096473.96566: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 27712 1727096473.97394: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27712 1727096473.97435: Loaded config def from plugin (connection/ssh) 27712 1727096473.97438: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 27712 1727096473.99585: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 27712 1727096473.99627: Loaded config def from plugin (connection/winrm) 27712 1727096473.99635: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 27712 1727096473.99670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 27712 1727096473.99737: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 27712 1727096473.99816: Loaded config def from plugin (shell/cmd) 27712 1727096473.99818: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 27712 1727096473.99844: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 27712 1727096473.99918: Loaded config def from plugin (shell/powershell) 27712 1727096473.99921: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 27712 1727096473.99985: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 27712 1727096474.00173: Loaded config def from plugin (shell/sh) 27712 1727096474.00179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 27712 1727096474.00214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 27712 1727096474.00344: Loaded config def from plugin (become/runas) 27712 1727096474.00346: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 27712 1727096474.00549: Loaded config def from plugin (become/su) 27712 1727096474.00552: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 27712 1727096474.00725: Loaded config def from plugin (become/sudo) 27712 1727096474.00728: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 27712 1727096474.00761: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml 27712 1727096474.01111: in VariableManager get_vars() 27712 1727096474.01131: done with get_vars() 27712 1727096474.01261: trying /usr/local/lib/python3.12/site-packages/ansible/modules 27712 1727096474.04436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 27712 1727096474.04554: in VariableManager get_vars() 
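The inventory parse above (the yaml plugin adding managed_node1, managed_node2, and managed_node3 to group all/ungrouped, with ansible_host and ansible_ssh_extra_args set per host) is consistent with an inventory file shaped roughly like the sketch below. This is a hedged reconstruction, not the actual /tmp/network-EuO/inventory.yml: the address for managed_node2 is taken from the SSH debug output later in this log, while the other addresses and the SSH options are placeholders.

    all:
      hosts:
        managed_node1:
          ansible_host: 192.0.2.11                                    # placeholder, not taken from this log
          ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"   # placeholder option
        managed_node2:
          ansible_host: 10.31.15.126                                  # address the SSH debug output below connects to
          ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"   # placeholder option
        managed_node3:
          ansible_host: 192.0.2.13                                    # placeholder, not taken from this log
          ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"   # placeholder option
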
27712 1727096474.04559: done with get_vars() 27712 1727096474.04561: variable 'playbook_dir' from source: magic vars 27712 1727096474.04562: variable 'ansible_playbook_python' from source: magic vars 27712 1727096474.04563: variable 'ansible_config_file' from source: magic vars 27712 1727096474.04563: variable 'groups' from source: magic vars 27712 1727096474.04564: variable 'omit' from source: magic vars 27712 1727096474.04565: variable 'ansible_version' from source: magic vars 27712 1727096474.04565: variable 'ansible_check_mode' from source: magic vars 27712 1727096474.04566: variable 'ansible_diff_mode' from source: magic vars 27712 1727096474.04566: variable 'ansible_forks' from source: magic vars 27712 1727096474.04573: variable 'ansible_inventory_sources' from source: magic vars 27712 1727096474.04573: variable 'ansible_skip_tags' from source: magic vars 27712 1727096474.04574: variable 'ansible_limit' from source: magic vars 27712 1727096474.04575: variable 'ansible_run_tags' from source: magic vars 27712 1727096474.04576: variable 'ansible_verbosity' from source: magic vars 27712 1727096474.04623: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml 27712 1727096474.06588: in VariableManager get_vars() 27712 1727096474.06604: done with get_vars() 27712 1727096474.06639: in VariableManager get_vars() 27712 1727096474.06651: done with get_vars() 27712 1727096474.06791: in VariableManager get_vars() 27712 1727096474.06804: done with get_vars() 27712 1727096474.06912: in VariableManager get_vars() 27712 1727096474.06927: done with get_vars() 27712 1727096474.06962: in VariableManager get_vars() 27712 1727096474.07180: done with get_vars() 27712 1727096474.07230: in VariableManager get_vars() 27712 1727096474.07242: done with get_vars() 27712 1727096474.07303: in VariableManager get_vars() 27712 1727096474.07316: done with get_vars() 27712 1727096474.07320: variable 'omit' from source: magic vars 27712 1727096474.07338: variable 'omit' from source: magic vars 27712 1727096474.07605: in VariableManager get_vars() 27712 1727096474.07615: done with get_vars() 27712 1727096474.07663: in VariableManager get_vars() 27712 1727096474.07680: done with get_vars() 27712 1727096474.07715: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27712 1727096474.08350: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27712 1727096474.08489: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27712 1727096474.10008: in VariableManager get_vars() 27712 1727096474.10031: done with get_vars() 27712 1727096474.10881: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 27712 1727096474.11325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27712 1727096474.15843: in VariableManager get_vars() 27712 1727096474.15865: done with get_vars() 27712 1727096474.15977: variable 'omit' from source: magic vars 27712 1727096474.15989: variable 'omit' from source: magic vars 27712 1727096474.16022: in VariableManager get_vars() 27712 1727096474.16037: done with get_vars() 27712 1727096474.16057: in VariableManager get_vars() 27712 1727096474.16076: done with get_vars() 27712 1727096474.16107: Loading data from 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27712 1727096474.16521: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27712 1727096474.16598: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27712 1727096474.21240: in VariableManager get_vars() 27712 1727096474.21266: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27712 1727096474.27134: in VariableManager get_vars() 27712 1727096474.27158: done with get_vars() 27712 1727096474.27199: in VariableManager get_vars() 27712 1727096474.27225: done with get_vars() 27712 1727096474.27371: in VariableManager get_vars() 27712 1727096474.27392: done with get_vars() 27712 1727096474.27436: in VariableManager get_vars() 27712 1727096474.27456: done with get_vars() 27712 1727096474.27496: in VariableManager get_vars() 27712 1727096474.27514: done with get_vars() 27712 1727096474.27557: in VariableManager get_vars() 27712 1727096474.27576: done with get_vars() 27712 1727096474.27643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 27712 1727096474.27658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 27712 1727096474.27911: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 27712 1727096474.28089: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 27712 1727096474.28092: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 27712 1727096474.28122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 27712 1727096474.28147: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 27712 1727096474.28335: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 27712 1727096474.28397: Loaded config def from plugin (callback/default) 27712 1727096474.28406: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 27712 1727096474.30379: Loaded config def from plugin (callback/junit) 27712 1727096474.30390: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 27712 1727096474.30435: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 27712 1727096474.30509: Loaded config def from plugin (callback/minimal) 27712 1727096474.30512: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 27712 1727096474.30549: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 27712 1727096474.30622: Loaded config def from plugin (callback/tree) 27712 1727096474.30625: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 27712 1727096474.30758: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 27712 1727096474.30761: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
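The files loaded above (tests_route_device_nm.yml, playbooks/tests_route_device.yml, and the network role's defaults/meta/tasks from fedora.linux_system_roles) match the usual *_nm wrapper pattern used by these tests: a small play that pins the provider, followed by an import of the provider-agnostic playbook. The actual file contents are not reproduced in this log, so the following is only a sketch of that pattern; the network_provider variable and the import_playbook step are assumptions based on the play name and the files being loaded.

    # Hedged sketch of a '*_nm' wrapper playbook; the real tests_route_device_nm.yml may differ.
    - name: Run playbook 'playbooks/tests_route_device.yml' with nm as provider
      hosts: all
      tasks:
        - name: Set network provider to 'nm'        # assumed task; variable name from the role's defaults
          ansible.builtin.set_fact:
            network_provider: nm

    - ansible.builtin.import_playbook: playbooks/tests_route_device.yml   # assumed import of the shared test
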
PLAYBOOK: tests_route_device_nm.yml ********************************************
2 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml
27712 1727096474.30832: in VariableManager get_vars()
27712 1727096474.30845: done with get_vars()
27712 1727096474.30851: in VariableManager get_vars()
27712 1727096474.30858: done with get_vars()
27712 1727096474.30862: variable 'omit' from source: magic vars
27712 1727096474.30907: in VariableManager get_vars()
27712 1727096474.30922: done with get_vars()
27712 1727096474.30944: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_route_device.yml' with nm as provider] *****
27712 1727096474.31527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
27712 1727096474.31607: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
27712 1727096474.31636: getting the remaining hosts for this loop
27712 1727096474.31638: done getting the remaining hosts for this loop
27712 1727096474.31641: getting the next task for host managed_node2
27712 1727096474.31644: done getting next task for host managed_node2
27712 1727096474.31646: ^ task is: TASK: Gathering Facts
27712 1727096474.31648: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
27712 1727096474.31650: getting variables
27712 1727096474.31651: in VariableManager get_vars()
27712 1727096474.31668: Calling all_inventory to load vars for managed_node2
27712 1727096474.31671: Calling groups_inventory to load vars for managed_node2
27712 1727096474.31674: Calling all_plugins_inventory to load vars for managed_node2
27712 1727096474.31685: Calling all_plugins_play to load vars for managed_node2
27712 1727096474.31696: Calling groups_plugins_inventory to load vars for managed_node2
27712 1727096474.31699: Calling groups_plugins_play to load vars for managed_node2
27712 1727096474.31731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
27712 1727096474.31792: done with get_vars()
27712 1727096474.31798: done getting variables
27712 1727096474.31858: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
Monday 23 September 2024 09:01:14 -0400 (0:00:00.012) 0:00:00.012 ******
27712 1727096474.31885: entering _queue_task() for managed_node2/gather_facts
27712 1727096474.31887: Creating lock for gather_facts
27712 1727096474.32391: worker is 1 (out of 1 available)
27712 1727096474.32400: exiting _queue_task() for managed_node2/gather_facts
27712 1727096474.32410: done queuing things up, now waiting for results queue to drain
27712 1727096474.32412: waiting for pending results...
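In the task execution trace that follows, the worker resolves the connection settings for managed_node2 ("Set connection var ansible_connection to ssh", pipelining False, timeout 10, shell type sh, and so on). Those are ordinary Ansible connection variables resolving to their defaults; if you wanted to pin them explicitly instead, they map onto inventory or group_vars entries like the hedged sketch below (the file placement is only an example; the values mirror what the trace reports).

    # e.g. group_vars/all.yml (example placement); values mirror the defaults resolved in the trace below
    ansible_connection: ssh
    ansible_pipelining: false
    ansible_timeout: 10
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_module_compression: ZIP_DEFLATED
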
27712 1727096474.32548: running TaskExecutor() for managed_node2/TASK: Gathering Facts 27712 1727096474.32603: in run() - task 0afff68d-5257-cbc7-8716-0000000000bf 27712 1727096474.32623: variable 'ansible_search_path' from source: unknown 27712 1727096474.32679: calling self._execute() 27712 1727096474.32826: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096474.32829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096474.32834: variable 'omit' from source: magic vars 27712 1727096474.32923: variable 'omit' from source: magic vars 27712 1727096474.32958: variable 'omit' from source: magic vars 27712 1727096474.33007: variable 'omit' from source: magic vars 27712 1727096474.33077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096474.33111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096474.33168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096474.33171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096474.33175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096474.33213: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096474.33221: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096474.33228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096474.33337: Set connection var ansible_connection to ssh 27712 1727096474.33382: Set connection var ansible_pipelining to False 27712 1727096474.33385: Set connection var ansible_timeout to 10 27712 1727096474.33387: Set connection var ansible_shell_type to sh 27712 1727096474.33389: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096474.33391: Set connection var ansible_shell_executable to /bin/sh 27712 1727096474.33417: variable 'ansible_shell_executable' from source: unknown 27712 1727096474.33425: variable 'ansible_connection' from source: unknown 27712 1727096474.33432: variable 'ansible_module_compression' from source: unknown 27712 1727096474.33439: variable 'ansible_shell_type' from source: unknown 27712 1727096474.33491: variable 'ansible_shell_executable' from source: unknown 27712 1727096474.33494: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096474.33496: variable 'ansible_pipelining' from source: unknown 27712 1727096474.33498: variable 'ansible_timeout' from source: unknown 27712 1727096474.33500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096474.33666: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096474.33684: variable 'omit' from source: magic vars 27712 1727096474.33693: starting attempt loop 27712 1727096474.33700: running the handler 27712 1727096474.33733: variable 'ansible_facts' from source: unknown 27712 1727096474.33776: _low_level_execute_command(): starting 27712 1727096474.33780: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096474.35027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096474.35085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096474.35272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096474.35284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096474.36995: stdout chunk (state=3): >>>/root <<< 27712 1727096474.37176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096474.37180: stdout chunk (state=3): >>><<< 27712 1727096474.37182: stderr chunk (state=3): >>><<< 27712 1727096474.37398: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096474.37402: _low_level_execute_command(): starting 27712 1727096474.37406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309 `" && echo ansible-tmp-1727096474.3728993-27750-29169466837309="` echo /root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309 `" ) && sleep 0' 27712 1727096474.38536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096474.38590: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 27712 1727096474.38931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096474.38959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096474.39028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096474.39134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096474.41035: stdout chunk (state=3): >>>ansible-tmp-1727096474.3728993-27750-29169466837309=/root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309 <<< 27712 1727096474.41185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096474.41519: stderr chunk (state=3): >>><<< 27712 1727096474.41522: stdout chunk (state=3): >>><<< 27712 1727096474.41526: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096474.3728993-27750-29169466837309=/root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096474.41528: variable 'ansible_module_compression' from source: unknown 27712 1727096474.41558: ANSIBALLZ: Using generic lock for ansible.legacy.setup 27712 1727096474.41696: ANSIBALLZ: Acquiring lock 27712 1727096474.41713: ANSIBALLZ: Lock acquired: 140297911472480 27712 1727096474.41721: ANSIBALLZ: Creating module 27712 1727096474.72719: ANSIBALLZ: Writing module into payload 27712 1727096474.72824: ANSIBALLZ: Writing module 27712 1727096474.72835: ANSIBALLZ: Renaming module 
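At this point the worker has created the remote temporary directory and is packaging ansible.legacy.setup into an AnsiballZ payload; the next commands in the trace probe the target for a Python interpreter (echo PLATFORM; uname; command -v python3.12 ...) and settle on /usr/bin/python3.12. That discovery step can be skipped by pinning the interpreter per host or group; a minimal hedged sketch, using the interpreter this log ends up selecting and an assumed group_vars placement:

    # e.g. group_vars/all.yml (example placement)
    ansible_python_interpreter: /usr/bin/python3.12   # matches the interpreter discovered in the trace below
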
27712 1727096474.72841: ANSIBALLZ: Done creating module 27712 1727096474.72876: variable 'ansible_facts' from source: unknown 27712 1727096474.72880: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096474.72892: _low_level_execute_command(): starting 27712 1727096474.72895: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 27712 1727096474.73341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096474.73345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096474.73347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096474.73350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096474.73406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096474.73409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096474.73411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096474.73459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096474.75397: stdout chunk (state=3): >>>PLATFORM <<< 27712 1727096474.75505: stdout chunk (state=3): >>>Linux <<< 27712 1727096474.75559: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 27712 1727096474.75755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096474.75758: stdout chunk (state=3): >>><<< 27712 1727096474.75760: stderr chunk (state=3): >>><<< 27712 1727096474.75778: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096474.75902 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 27712 1727096474.75906: _low_level_execute_command(): starting 27712 1727096474.75908: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 27712 1727096474.76096: Sending initial data 27712 1727096474.76099: Sent initial data (1181 bytes) 27712 1727096474.76482: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096474.76517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096474.76539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096474.76556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096474.76621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096474.81291: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 27712 1727096474.81623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096474.81627: stdout chunk (state=3): >>><<< 27712 1727096474.81773: stderr chunk (state=3): >>><<< 27712 1727096474.81776: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096474.81779: variable 'ansible_facts' from source: unknown 27712 1727096474.81781: variable 'ansible_facts' from source: unknown 27712 1727096474.81786: variable 'ansible_module_compression' from source: unknown 27712 1727096474.81816: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 27712 1727096474.81850: variable 'ansible_facts' from source: unknown 27712 1727096474.82082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/AnsiballZ_setup.py 27712 1727096474.82240: Sending initial data 27712 1727096474.82250: Sent initial data (153 bytes) 27712 1727096474.82824: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096474.82833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096474.82844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096474.82858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096474.82949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096474.83029: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096474.83078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096474.84903: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096474.84953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096474.85139: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpyjbs6ava /root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/AnsiballZ_setup.py <<< 27712 1727096474.85142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/AnsiballZ_setup.py" <<< 27712 1727096474.85196: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpyjbs6ava" to remote "/root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/AnsiballZ_setup.py" <<< 27712 1727096474.86802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096474.86976: stderr chunk (state=3): >>><<< 27712 1727096474.86980: stdout chunk (state=3): >>><<< 27712 1727096474.86982: done transferring module to remote 27712 1727096474.86984: _low_level_execute_command(): starting 27712 1727096474.86987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/ /root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/AnsiballZ_setup.py && sleep 0' 27712 1727096474.88122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096474.88189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096474.88245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096474.88250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096474.88313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096474.91010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096474.91014: stdout chunk (state=3): >>><<< 27712 1727096474.91017: stderr chunk (state=3): >>><<< 27712 1727096474.91033: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096474.91040: _low_level_execute_command(): starting 27712 1727096474.91051: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/AnsiballZ_setup.py && sleep 0' 27712 1727096474.91716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096474.91733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096474.91747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096474.91787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096474.91878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096474.91899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096474.91946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 
1727096474.92007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096474.95256: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 27712 1727096474.95341: stdout chunk (state=3): >>>import _imp # builtin <<< 27712 1727096474.95409: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 27712 1727096474.95614: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 27712 1727096474.95617: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 27712 1727096474.95656: stdout chunk (state=3): >>>import 'time' # <<< 27712 1727096474.95676: stdout chunk (state=3): >>>import 'zipimport' # <<< 27712 1727096474.95714: stdout chunk (state=3): >>># installed zipimport hook <<< 27712 1727096474.95764: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 27712 1727096474.95773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096474.95801: stdout chunk (state=3): >>>import '_codecs' # <<< 27712 1727096474.95921: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73bbc4d0> <<< 27712 1727096474.95949: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73b8bb00> <<< 27712 1727096474.95989: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 27712 1727096474.96002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73bbea50> <<< 27712 1727096474.96038: stdout chunk (state=3): >>>import '_signal' # <<< 27712 1727096474.96076: stdout chunk (state=3): >>>import '_abc' # <<< 27712 1727096474.96210: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 27712 1727096474.96291: stdout chunk (state=3): >>>import '_collections_abc' # <<< 27712 1727096474.96345: stdout chunk (state=3): >>>import 'genericpath' # <<< 27712 1727096474.96348: stdout chunk (state=3): >>>import 'posixpath' # <<< 27712 1727096474.96410: stdout chunk (state=3): >>>import 'os' # <<< 27712 1727096474.96442: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 27712 1727096474.96461: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 27712 1727096474.96499: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 27712 1727096474.96523: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 27712 1727096474.96553: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 27712 1727096474.96734: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73bcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73bcdfa0> <<< 27712 1727096474.96742: stdout chunk (state=3): >>>import 'site' # <<< 27712 1727096474.96778: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 27712 1727096474.97494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096474.97538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 27712 1727096474.97595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 27712 1727096474.97653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 27712 1727096474.97676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 27712 1727096474.97715: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739cbda0> <<< 27712 1727096474.97727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 27712 1727096474.97785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 27712 1727096474.97949: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739cbfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 27712 1727096474.97981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 27712 1727096474.98033: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 27712 1727096474.98051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a037a0> <<< 27712 1727096474.98076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 27712 1727096474.98115: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 27712 1727096474.98118: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a03e30> <<< 27712 1727096474.98150: stdout chunk (state=3): >>>import '_collections' # <<< 27712 1727096474.98211: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739e3a70> <<< 27712 1727096474.98244: stdout chunk (state=3): >>>import '_functools' # <<< 27712 1727096474.98496: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739e1190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739c8f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 27712 1727096474.98519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 27712 1727096474.98524: stdout chunk (state=3): >>>import '_sre' # <<< 27712 1727096474.98553: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 27712 1727096474.98818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a23710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a22330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739e2060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739ca810> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a587a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739c81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 27712 1727096474.98845: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096474.98986: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a58c50> <<< 27712 1727096474.98990: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a58b00> <<< 27712 1727096474.98992: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096474.99129: stdout chunk (state=3): >>># extension module 'binascii' executed from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a58ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739c6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a595b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a59280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 27712 1727096474.99155: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a5a4b0> import 'importlib.util' # <<< 27712 1727096474.99184: stdout chunk (state=3): >>>import 'runpy' # <<< 27712 1727096474.99231: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 27712 1727096474.99294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 27712 1727096474.99379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a706e0> <<< 27712 1727096474.99408: stdout chunk (state=3): >>>import 'errno' # <<< 27712 1727096474.99420: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096474.99490: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a71df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 27712 1727096474.99518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 27712 1727096474.99537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 27712 1727096474.99559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 27712 1727096474.99616: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a72c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a732c0> <<< 27712 1727096474.99683: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f0a73a721b0> <<< 27712 1727096474.99705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 27712 1727096474.99733: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096474.99815: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a73d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a73470> <<< 27712 1727096474.99833: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a5a510> <<< 27712 1727096474.99874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 27712 1727096474.99924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 27712 1727096474.99927: stdout chunk (state=3): >>> <<< 27712 1727096474.99949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 27712 1727096474.99993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 27712 1727096475.00216: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a7377fb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737a8620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737a8380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737a8650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 27712 1727096475.00251: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.00451: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737a8f80> <<< 27712 1727096475.00662: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.00674: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.00718: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737a98b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737a8830> <<< 27712 1727096475.00721: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7377dd30> <<< 27712 1727096475.00751: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 27712 1727096475.00810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 27712 1727096475.00822: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 27712 1727096475.00856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 27712 1727096475.00869: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737aac00> <<< 27712 1727096475.00915: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737a8e00> <<< 27712 1727096475.00940: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a5ac00> <<< 27712 1727096475.00984: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 27712 1727096475.01077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096475.01105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 27712 1727096475.01320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737d6f60> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 27712 1727096475.01347: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 27712 1727096475.01396: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737fb320> <<< 27712 1727096475.01430: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 27712 1727096475.01499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 27712 1727096475.01590: stdout chunk (state=3): >>>import 'ntpath' # <<< 27712 1727096475.01641: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 27712 1727096475.01644: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096475.01691: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73827f80> <<< 27712 1727096475.01694: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 27712 1727096475.01728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 27712 1727096475.01766: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 27712 1727096475.01833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 27712 1727096475.01975: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7385a7e0> <<< 27712 1727096475.02091: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a738581a0> <<< 27712 1727096475.02210: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737fbfb0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731291c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737fa120> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737abb60> <<< 27712 1727096475.02494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 27712 1727096475.02526: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0a737fa240> <<< 27712 1727096475.03034: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_gmrtzntz/ansible_ansible.legacy.setup_payload.zip' <<< 27712 1727096475.03051: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.03266: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.03532: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7318ae40> <<< 27712 1727096475.03535: stdout chunk (state=3): >>>import '_typing' # <<< 27712 1727096475.03832: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73169d30> <<< 27712 1727096475.03836: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73168ef0> <<< 27712 1727096475.03855: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 27712 1727096475.03899: stdout chunk (state=3): >>>import 'ansible' # <<< 27712 1727096475.03946: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.03950: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.03986: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.04000: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 27712 1727096475.04020: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.06275: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.08123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 27712 1727096475.08171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 27712 1727096475.08176: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73188ce0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 27712 1727096475.08194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096475.08239: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 27712 1727096475.08256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 27712 1727096475.08290: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 27712 1727096475.08423: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a731c2870> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731c2600> <<< 27712 1727096475.08444: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731c1f10> <<< 27712 1727096475.08483: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 27712 1727096475.08500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 27712 1727096475.08553: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731c2660> <<< 27712 1727096475.08591: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7318b860> import 'atexit' # <<< 27712 1727096475.08618: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.08715: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a731c35c0> # extension module 'fcntl' loaded from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a731c3800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 27712 1727096475.08816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 27712 1727096475.08854: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731c3d40> <<< 27712 1727096475.08870: stdout chunk (state=3): >>>import 'pwd' # <<< 27712 1727096475.08903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 27712 1727096475.08938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 27712 1727096475.08997: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73025a00> <<< 27712 1727096475.09211: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73027680> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73027f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7302cf80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 27712 1727096475.09264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 27712 1727096475.09295: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 27712 1727096475.09316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 27712 1727096475.09394: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7302fce0> <<< 27712 1727096475.09474: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.09498: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737d6ed0> <<< 27712 1727096475.09511: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7302dfa0> <<< 27712 1727096475.09538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py <<< 27712 1727096475.09600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 27712 1727096475.09621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 27712 1727096475.09633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 27712 1727096475.09660: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 27712 1727096475.09826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 27712 1727096475.09865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 27712 1727096475.09895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73033c20> <<< 27712 1727096475.10110: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73032720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73032480> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 27712 1727096475.10156: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730329c0> <<< 27712 1727096475.10199: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7302e4b0> <<< 27712 1727096475.10257: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.10278: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73077e00> <<< 27712 1727096475.10317: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 27712 1727096475.10327: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73077800> <<< 27712 1727096475.10381: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 27712 1727096475.10400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 27712 1727096475.10435: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 27712 1727096475.10445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 27712 1727096475.10512: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.10535: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73079a30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730797f0> <<< 27712 1727096475.10557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 27712 1727096475.10616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 27712 1727096475.10666: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.10814: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a7307bfb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7307a120> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 27712 1727096475.10866: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7307f5f0> <<< 27712 1727096475.11076: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7307bf20> <<< 27712 1727096475.11191: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.11195: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.11231: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73080740> <<< 27712 1727096475.11254: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.11270: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a730808c0> <<< 27712 1727096475.11332: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73080950> <<< 27712 1727096475.11362: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73078170> <<< 27712 
1727096475.11392: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 27712 1727096475.11508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73083fe0> <<< 27712 1727096475.11750: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.11793: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72f0d250> <<< 27712 1727096475.11796: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730827e0> <<< 27712 1727096475.11857: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.11884: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73083b90> <<< 27712 1727096475.11901: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73082420> # zipimport: zlib available <<< 27712 1727096475.11916: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.12007: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 27712 1727096475.12080: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.12216: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.12248: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.12282: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 27712 1727096475.12331: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.12334: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.12345: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 27712 1727096475.12512: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.12543: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.12750: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.13641: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.14565: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 27712 1727096475.14586: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 27712 1727096475.14621: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 
'ansible.module_utils.common.text.converters' # <<< 27712 1727096475.14656: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 27712 1727096475.14692: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096475.14784: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.14787: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.14802: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72f11460> <<< 27712 1727096475.15011: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f12240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730804a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 27712 1727096475.15051: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.15081: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 27712 1727096475.15098: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.15336: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.15589: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 27712 1727096475.15639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 27712 1727096475.15643: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f122d0> # zipimport: zlib available <<< 27712 1727096475.16438: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.16805: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.16822: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.16922: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 27712 1727096475.17007: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 27712 1727096475.17050: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.17164: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 27712 1727096475.17182: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 27712 1727096475.17229: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.17264: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 27712 1727096475.17283: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.17496: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.17826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches 
/usr/lib64/python3.12/ast.py <<< 27712 1727096475.17921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 27712 1727096475.17924: stdout chunk (state=3): >>>import '_ast' # <<< 27712 1727096475.18017: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f13380> <<< 27712 1727096475.18038: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.18127: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.18233: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 27712 1727096475.18265: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 27712 1727096475.18289: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 27712 1727096475.18323: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.18396: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 27712 1727096475.18496: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.18663: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 27712 1727096475.18727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096475.18850: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72f1dd00> <<< 27712 1727096475.18938: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f18d70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 27712 1727096475.19031: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.19152: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.19211: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096475.19297: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 27712 1727096475.19318: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 27712 1727096475.19433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 27712 1727096475.19436: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 27712 1727096475.19523: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73006690> <<< 27712 1727096475.19631: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730fe360> <<< 27712 1727096475.19688: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f1dcd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f132f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 27712 1727096475.19756: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.19773: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 27712 1727096475.19876: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 27712 1727096475.19895: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 27712 1727096475.20012: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.20053: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.20100: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.20150: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.20219: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.20315: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # <<< 27712 1727096475.20328: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.20429: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.20531: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.20600: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.20632: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 27712 1727096475.20824: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.21098: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.21129: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096475.21193: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 27712 1727096475.21236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb1af0> <<< 27712 1727096475.21256: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 27712 1727096475.21275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 27712 1727096475.21391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 27712 1727096475.21410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bafad0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72bafe30> <<< 27712 1727096475.21638: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb3050> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb2630> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb01d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb0bc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 27712 1727096475.21647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 27712 1727096475.21684: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72bc6ed0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bc6780> <<< 27712 1727096475.21714: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.21750: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72bc6960> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bc5bb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 27712 1727096475.21902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 27712 1727096475.21907: stdout chunk (state=3): >>>import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bc70b0> <<< 27712 1727096475.21938: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 27712 1727096475.22018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 27712 1727096475.22076: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72c1dbe0> <<< 27712 1727096475.22079: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bc7bc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb03e0> import 'ansible.module_utils.facts.timeout' # <<< 27712 1727096475.22181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.22220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 27712 1727096475.22266: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.22299: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.22361: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 27712 1727096475.22512: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.22549: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 27712 1727096475.22585: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.22634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 27712 1727096475.22694: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.22755: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.23022: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 27712 1727096475.23664: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.24430: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 27712 1727096475.24524: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.24552: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.24595: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.24648: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 27712 1727096475.24652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 27712 
1727096475.24698: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.24735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 27712 1727096475.24815: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.24914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 27712 1727096475.24944: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.24984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 27712 1727096475.25008: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.25076: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 27712 1727096475.25087: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.25191: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.25315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 27712 1727096475.25364: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72c1e5a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 27712 1727096475.25401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 27712 1727096475.25597: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72c1e780> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 27712 1727096475.25711: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.25791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 27712 1727096475.26196: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 27712 1727096475.26287: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.26437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 27712 1727096475.26440: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.26534: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.26648: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 27712 1727096475.26716: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.26774: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72c5de80> <<< 27712 1727096475.26975: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7316b6e0> import 'ansible.module_utils.facts.system.python' # <<< 27712 1727096475.26992: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 
1727096475.27081: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 27712 1727096475.27196: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27278: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27397: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27534: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 27712 1727096475.27550: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27589: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27639: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 27712 1727096475.27654: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27677: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27737: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 27712 1727096475.27780: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096475.27848: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72c65a30> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72c42ea0> import 'ansible.module_utils.facts.system.user' # <<< 27712 1727096475.27973: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 27712 1727096475.27976: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.27991: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 27712 1727096475.28095: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.28243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 27712 1727096475.28285: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.28353: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.28519: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.28528: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.28594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 27712 1727096475.28618: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.28635: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.28778: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.28891: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 27712 1727096475.28961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 27712 1727096475.29073: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 27712 1727096475.29201: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 27712 1727096475.29226: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.29810: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.30324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 27712 1727096475.30449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 27712 1727096475.30452: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.30541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 27712 1727096475.30566: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.30642: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.30747: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 27712 1727096475.30875: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.30987: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.31092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 27712 1727096475.31100: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.31234: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 27712 1727096475.31294: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.31391: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.31601: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.31806: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 27712 1727096475.31892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 27712 1727096475.31921: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 27712 1727096475.31989: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 27712 1727096475.32099: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.32127: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.32231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 27712 1727096475.32242: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.32287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 27712 1727096475.32299: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.32348: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.32438: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 27712 1727096475.32444: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.32684: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.32969: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 27712 1727096475.33002: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.33108: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 27712 1727096475.33121: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.33148: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 27712 1727096475.33223: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.33245: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 27712 1727096475.33325: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.33328: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 27712 1727096475.33376: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.33461: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 27712 1727096475.33585: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 27712 1727096475.33589: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.33607: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 27712 1727096475.33733: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 27712 1727096475.33748: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.33802: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34073: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 27712 1727096475.34089: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34114: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 27712 1727096475.34194: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 27712 1727096475.34398: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34441: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34483: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 27712 1727096475.34505: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34543: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34586: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 27712 1727096475.34689: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34877: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 27712 1727096475.34881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 27712 1727096475.34897: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.34940: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 27712 1727096475.35093: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096475.35910: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 27712 1727096475.35946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 27712 1727096475.35977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 27712 1727096475.36088: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72a62f00> <<< 27712 1727096475.36092: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72a601a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72a63a10> <<< 27712 1727096475.48906: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 27712 1727096475.48953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 27712 1727096475.49023: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72aaacf0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 27712 1727096475.49059: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72aa8e30> <<< 27712 1727096475.49098: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 27712 1727096475.49147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 27712 1727096475.49174: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72aab140> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72aa9e20> <<< 27712 1727096475.49707: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a 
frame <<< 27712 1727096475.74311: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "01", "second": "15", "epoch": "1727096475", "epoch_int": "1727096475", "date": "2024-09-23", "time": "09:01:15", "iso8601_micro": "2024-09-23T13:01:15.351355Z", "iso8601": "2024-09-23T13:01:15Z", "iso8601_basic": "20240923T090115351355", "iso8601_basic_short": "20240923T090115", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28<<< 27712 1727096475.74375: stdout chunk (state=3): >>>dde2945b45c603c07d1816f189ea", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.4208984375, "5m": 0.46533203125, "15m": 0.2841796875}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2940, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 591, "free": 2940}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 617, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794787328, "block_size": 4096, "block_total": 65519099, "block_available": 63914743, "block_used": 1604356, "inode_total": 131070960, "inode_available": 131029094, "inode_used": 41866, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_env": 
{"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]",<<< 27712 1727096475.74424: stdout chunk (state=3): >>> "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 
9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 27712 1727096475.75354: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanu<<< 27712 1727096475.75428: stdout chunk (state=3): >>>p[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout <<< 27712 1727096475.75495: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing 
ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base <<< 27712 1727096475.75508: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # 
destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot <<< 27712 1727096475.75555: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution <<< 27712 1727096475.75793: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg <<< 27712 1727096475.75807: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing 
multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 27712 1727096475.76093: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 27712 1727096475.76104: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 27712 1727096475.76137: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 27712 1727096475.76263: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 27712 1727096475.76282: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 27712 1727096475.76316: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 27712 1727096475.76355: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 27712 1727096475.76458: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 27712 1727096475.76494: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 27712 1727096475.76562: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 27712 1727096475.76854: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 27712 1727096475.77104: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket <<< 27712 1727096475.77226: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error <<< 27712 1727096475.77260: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 27712 1727096475.77285: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 27712 1727096475.77385: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 27712 1727096475.77434: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 27712 1727096475.77552: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy 
_thread # clear sys.audit hooks <<< 27712 1727096475.78256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096475.78260: stdout chunk (state=3): >>><<< 27712 1727096475.78262: stderr chunk (state=3): >>><<< 27712 1727096475.78819: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73bbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73b8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73bbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73bcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73bcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739cbda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739cbfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a037a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a03e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739e3a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739e1190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739c8f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a23710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a22330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739e2060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739ca810> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a587a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739c81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a58c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a58b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a58ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a739c6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a595b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a59280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a5a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a706e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a71df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f0a73a72c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a732c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a721b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73a73d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a73470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a5a510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a7377fb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737a8620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737a8380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737a8650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737a8f80> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737a98b0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f0a737a8830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7377dd30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737aac00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737a8e00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73a5ac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737d6f60> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737fb320> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73827f80> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7385a7e0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a738581a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737fbfb0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731291c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737fa120> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a737abb60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f0a737fa240> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_gmrtzntz/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7318ae40> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73169d30> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73168ef0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73188ce0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a731c2870> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731c2600> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731c1f10> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731c2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7318b860> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a731c35c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
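The "zipimport: found 103 names in '...setup_payload.zip'" line above, and the "import 'ansible.module_utils...'" lines that follow, rely on Python's ability to import packages directly from a zip archive placed on sys.path. A minimal self-contained sketch of that mechanism; the archive and module names below are invented for illustration and are not the real payload:

import os
import sys
import tempfile
import zipfile

# Build a throwaway archive containing one module, then import it through zipimport.
workdir = tempfile.mkdtemp()
archive = os.path.join(workdir, "payload.zip")   # hypothetical archive name
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("hello_mod.py", "GREETING = 'loaded from a zip'\n")

sys.path.insert(0, archive)   # a zip entry on sys.path is served by the zipimport machinery
import hello_mod              # same import path as the payload modules traced above

print(hello_mod.GREETING)

Under -v that same machinery is what prints the repeated "# zipimport: zlib available" lines, since zlib is what decompresses compressed archive members.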
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a731c3800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a731c3d40> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73025a00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73027680> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73027f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7302cf80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7302fce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a737d6ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7302dfa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73033c20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73032720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f0a73032480> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730329c0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7302e4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73077e00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73077800> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73079a30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730797f0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a7307bfb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7307a120> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7307f5f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7307bf20> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73080740> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a730808c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73080950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73078170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73083fe0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72f0d250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730827e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a73083b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73082420> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72f11460> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f12240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730804a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f122d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f13380> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72f1dd00> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f18d70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a73006690> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a730fe360> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f1dcd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72f132f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb1af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bafad0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72bafe30> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb3050> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb2630> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb01d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb0bc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72bc6ed0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bc6780> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72bc6960> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bc5bb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bc70b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72c1dbe0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72bc7bc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72fb03e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72c1e5a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72c1e780> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72c5de80> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a7316b6e0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72c65a30> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72c42ea0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0a72a62f00> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72a601a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72a63a10> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72aaacf0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72aa8e30> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72aab140> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0a72aa9e20> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "01", "second": "15", "epoch": "1727096475", "epoch_int": "1727096475", "date": "2024-09-23", "time": "09:01:15", "iso8601_micro": "2024-09-23T13:01:15.351355Z", "iso8601": "2024-09-23T13:01:15Z", "iso8601_basic": "20240923T090115351355", "iso8601_basic_short": "20240923T090115", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.4208984375, "5m": 0.46533203125, "15m": 0.2841796875}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2940, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 591, "free": 2940}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 617, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794787328, "block_size": 4096, "block_total": 65519099, "block_available": 63914743, "block_used": 1604356, "inode_total": 131070960, "inode_available": 131029094, "inode_used": 41866, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off 
[fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::8ff:ceff:fe61:4d8f"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing 
random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # 
cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] 
removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping 
_functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. [WARNING]: Module invocation had junk after the JSON data: [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
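The interpreter-discovery warning above can be avoided by pinning the interpreter per host instead of relying on discovery. The actual /tmp/network-EuO/inventory.yml used by this run is not shown in the log, so the snippet below is only a hypothetical sketch of such a pin; ansible_host and ansible_python_interpreter are standard inventory variables, but the host entry and values here are assumptions taken from the surrounding output.

    all:
      hosts:
        managed_node2:
          # assumed address, matching the connection target seen in the SSH debug output above
          ansible_host: 10.31.15.126
          # pin the interpreter that discovery found, so the warning is not emitted
          ansible_python_interpreter: /usr/bin/python3.12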
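For reference, the large result earlier in this block is the fact-gathering payload: keys such as ansible_eth0, ansible_default_ipv4 and ansible_selinux become host variables once the task reports ok. The task below is not part of this test run; it is only a minimal hypothetical sketch of how those facts could be consumed in a play. ansible.builtin.debug is a standard module, and the variable names are taken verbatim from the output above.

    - name: Show a few of the facts gathered above (illustration only)
      ansible.builtin.debug:
        msg: >-
          {{ ansible_default_ipv4.interface }} has address
          {{ ansible_default_ipv4.address }}/{{ ansible_default_ipv4.prefix }}
          and MTU {{ ansible_eth0.mtu }}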
27712 1727096475.83230: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096475.83342: _low_level_execute_command(): starting 27712 1727096475.83359: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096474.3728993-27750-29169466837309/ > /dev/null 2>&1 && sleep 0' 27712 1727096475.84770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096475.84864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096475.84986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096475.85072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096475.85110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096475.87501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096475.87506: stdout chunk (state=3): >>><<< 27712 1727096475.87509: stderr chunk (state=3): >>><<< 27712 1727096475.87512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096475.87514: handler run complete 27712 1727096475.87764: variable 'ansible_facts' from source: unknown 27712 1727096475.88281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096475.88694: variable 'ansible_facts' from source: unknown 27712 1727096475.88827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096475.89184: attempt loop complete, returning result 27712 1727096475.89194: _execute() done 27712 1727096475.89384: dumping result to json 27712 1727096475.89387: done dumping result, returning 27712 1727096475.89390: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0afff68d-5257-cbc7-8716-0000000000bf] 27712 1727096475.89392: sending task result for task 0afff68d-5257-cbc7-8716-0000000000bf 27712 1727096475.90781: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000bf 27712 1727096475.90785: WORKER PROCESS EXITING ok: [managed_node2] 27712 1727096475.91388: no more pending results, returning what we have 27712 1727096475.91391: results queue empty 27712 1727096475.91392: checking for any_errors_fatal 27712 1727096475.91393: done checking for any_errors_fatal 27712 1727096475.91394: checking for max_fail_percentage 27712 1727096475.91395: done checking for max_fail_percentage 27712 1727096475.91396: checking to see if all hosts have failed and the running result is not ok 27712 1727096475.91397: done checking to see if all hosts have failed 27712 1727096475.91398: getting the remaining hosts for this loop 27712 1727096475.91399: done getting the remaining hosts for this loop 27712 1727096475.91403: getting the next task for host managed_node2 27712 1727096475.91409: done getting next task for host managed_node2 27712 1727096475.91411: ^ task is: TASK: meta (flush_handlers) 27712 1727096475.91413: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096475.91416: getting variables 27712 1727096475.91532: in VariableManager get_vars() 27712 1727096475.91555: Calling all_inventory to load vars for managed_node2 27712 1727096475.91558: Calling groups_inventory to load vars for managed_node2 27712 1727096475.91561: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096475.91573: Calling all_plugins_play to load vars for managed_node2 27712 1727096475.91576: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096475.91579: Calling groups_plugins_play to load vars for managed_node2 27712 1727096475.91996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096475.92431: done with get_vars() 27712 1727096475.92442: done getting variables 27712 1727096475.92573: in VariableManager get_vars() 27712 1727096475.92584: Calling all_inventory to load vars for managed_node2 27712 1727096475.92586: Calling groups_inventory to load vars for managed_node2 27712 1727096475.92589: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096475.92594: Calling all_plugins_play to load vars for managed_node2 27712 1727096475.92596: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096475.92599: Calling groups_plugins_play to load vars for managed_node2 27712 1727096475.92973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096475.93390: done with get_vars() 27712 1727096475.93405: done queuing things up, now waiting for results queue to drain 27712 1727096475.93407: results queue empty 27712 1727096475.93408: checking for any_errors_fatal 27712 1727096475.93410: done checking for any_errors_fatal 27712 1727096475.93411: checking for max_fail_percentage 27712 1727096475.93412: done checking for max_fail_percentage 27712 1727096475.93413: checking to see if all hosts have failed and the running result is not ok 27712 1727096475.93413: done checking to see if all hosts have failed 27712 1727096475.93419: getting the remaining hosts for this loop 27712 1727096475.93420: done getting the remaining hosts for this loop 27712 1727096475.93423: getting the next task for host managed_node2 27712 1727096475.93427: done getting next task for host managed_node2 27712 1727096475.93430: ^ task is: TASK: Include the task 'el_repo_setup.yml' 27712 1727096475.93431: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096475.93433: getting variables 27712 1727096475.93434: in VariableManager get_vars() 27712 1727096475.93442: Calling all_inventory to load vars for managed_node2 27712 1727096475.93444: Calling groups_inventory to load vars for managed_node2 27712 1727096475.93446: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096475.93451: Calling all_plugins_play to load vars for managed_node2 27712 1727096475.93453: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096475.93455: Calling groups_plugins_play to load vars for managed_node2 27712 1727096475.93791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096475.94009: done with get_vars() 27712 1727096475.94017: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:11 Monday 23 September 2024 09:01:15 -0400 (0:00:01.622) 0:00:01.634 ****** 27712 1727096475.94093: entering _queue_task() for managed_node2/include_tasks 27712 1727096475.94095: Creating lock for include_tasks 27712 1727096475.95146: worker is 1 (out of 1 available) 27712 1727096475.95157: exiting _queue_task() for managed_node2/include_tasks 27712 1727096475.95170: done queuing things up, now waiting for results queue to drain 27712 1727096475.95171: waiting for pending results... 27712 1727096475.95473: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 27712 1727096475.95550: in run() - task 0afff68d-5257-cbc7-8716-000000000006 27712 1727096475.95607: variable 'ansible_search_path' from source: unknown 27712 1727096475.95714: calling self._execute() 27712 1727096475.95891: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096475.95897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096475.95915: variable 'omit' from source: magic vars 27712 1727096475.96216: _execute() done 27712 1727096475.96220: dumping result to json 27712 1727096475.96222: done dumping result, returning 27712 1727096475.96224: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-cbc7-8716-000000000006] 27712 1727096475.96226: sending task result for task 0afff68d-5257-cbc7-8716-000000000006 27712 1727096475.96414: no more pending results, returning what we have 27712 1727096475.96419: in VariableManager get_vars() 27712 1727096475.96457: Calling all_inventory to load vars for managed_node2 27712 1727096475.96460: Calling groups_inventory to load vars for managed_node2 27712 1727096475.96464: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096475.96480: Calling all_plugins_play to load vars for managed_node2 27712 1727096475.96484: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096475.96487: Calling groups_plugins_play to load vars for managed_node2 27712 1727096475.97084: done sending task result for task 0afff68d-5257-cbc7-8716-000000000006 27712 1727096475.97087: WORKER PROCESS EXITING 27712 1727096475.97111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096475.97657: done with get_vars() 27712 1727096475.97665: variable 'ansible_search_path' from source: unknown 27712 1727096475.97683: we have included files to process 27712 1727096475.97684: 
generating all_blocks data 27712 1727096475.97685: done generating all_blocks data 27712 1727096475.97686: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27712 1727096475.97687: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27712 1727096475.97690: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 27712 1727096475.99101: in VariableManager get_vars() 27712 1727096475.99120: done with get_vars() 27712 1727096475.99131: done processing included file 27712 1727096475.99133: iterating over new_blocks loaded from include file 27712 1727096475.99135: in VariableManager get_vars() 27712 1727096475.99260: done with get_vars() 27712 1727096475.99262: filtering new block on tags 27712 1727096475.99280: done filtering new block on tags 27712 1727096475.99284: in VariableManager get_vars() 27712 1727096475.99295: done with get_vars() 27712 1727096475.99297: filtering new block on tags 27712 1727096475.99312: done filtering new block on tags 27712 1727096475.99314: in VariableManager get_vars() 27712 1727096475.99325: done with get_vars() 27712 1727096475.99326: filtering new block on tags 27712 1727096475.99338: done filtering new block on tags 27712 1727096475.99340: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 27712 1727096475.99347: extending task lists for all hosts with included blocks 27712 1727096475.99512: done extending task lists 27712 1727096475.99514: done processing included files 27712 1727096475.99515: results queue empty 27712 1727096475.99515: checking for any_errors_fatal 27712 1727096475.99517: done checking for any_errors_fatal 27712 1727096475.99517: checking for max_fail_percentage 27712 1727096475.99519: done checking for max_fail_percentage 27712 1727096475.99519: checking to see if all hosts have failed and the running result is not ok 27712 1727096475.99520: done checking to see if all hosts have failed 27712 1727096475.99521: getting the remaining hosts for this loop 27712 1727096475.99522: done getting the remaining hosts for this loop 27712 1727096475.99524: getting the next task for host managed_node2 27712 1727096475.99528: done getting next task for host managed_node2 27712 1727096475.99530: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 27712 1727096475.99532: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096475.99534: getting variables 27712 1727096475.99535: in VariableManager get_vars() 27712 1727096475.99544: Calling all_inventory to load vars for managed_node2 27712 1727096475.99547: Calling groups_inventory to load vars for managed_node2 27712 1727096475.99549: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096475.99555: Calling all_plugins_play to load vars for managed_node2 27712 1727096475.99557: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096475.99560: Calling groups_plugins_play to load vars for managed_node2 27712 1727096476.00182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096476.00532: done with get_vars() 27712 1727096476.00542: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 09:01:16 -0400 (0:00:00.066) 0:00:01.700 ****** 27712 1727096476.00733: entering _queue_task() for managed_node2/setup 27712 1727096476.01456: worker is 1 (out of 1 available) 27712 1727096476.01469: exiting _queue_task() for managed_node2/setup 27712 1727096476.01479: done queuing things up, now waiting for results queue to drain 27712 1727096476.01480: waiting for pending results... 27712 1727096476.02199: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 27712 1727096476.02399: in run() - task 0afff68d-5257-cbc7-8716-0000000000d0 27712 1727096476.02629: variable 'ansible_search_path' from source: unknown 27712 1727096476.02632: variable 'ansible_search_path' from source: unknown 27712 1727096476.02635: calling self._execute() 27712 1727096476.02800: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096476.02855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096476.03079: variable 'omit' from source: magic vars 27712 1727096476.04869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096476.09099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096476.09227: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096476.09315: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096476.09480: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096476.09515: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096476.09644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096476.09753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096476.09807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27712 1727096476.10042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096476.10045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096476.10381: variable 'ansible_facts' from source: unknown 27712 1727096476.10457: variable 'network_test_required_facts' from source: task vars 27712 1727096476.10529: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 27712 1727096476.10588: when evaluation is False, skipping this task 27712 1727096476.10596: _execute() done 27712 1727096476.10608: dumping result to json 27712 1727096476.10616: done dumping result, returning 27712 1727096476.10627: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-cbc7-8716-0000000000d0] 27712 1727096476.10636: sending task result for task 0afff68d-5257-cbc7-8716-0000000000d0 27712 1727096476.10921: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000d0 27712 1727096476.10928: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 27712 1727096476.10997: no more pending results, returning what we have 27712 1727096476.11000: results queue empty 27712 1727096476.11001: checking for any_errors_fatal 27712 1727096476.11003: done checking for any_errors_fatal 27712 1727096476.11003: checking for max_fail_percentage 27712 1727096476.11005: done checking for max_fail_percentage 27712 1727096476.11006: checking to see if all hosts have failed and the running result is not ok 27712 1727096476.11007: done checking to see if all hosts have failed 27712 1727096476.11007: getting the remaining hosts for this loop 27712 1727096476.11009: done getting the remaining hosts for this loop 27712 1727096476.11041: getting the next task for host managed_node2 27712 1727096476.11052: done getting next task for host managed_node2 27712 1727096476.11055: ^ task is: TASK: Check if system is ostree 27712 1727096476.11059: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096476.11062: getting variables 27712 1727096476.11064: in VariableManager get_vars() 27712 1727096476.11098: Calling all_inventory to load vars for managed_node2 27712 1727096476.11101: Calling groups_inventory to load vars for managed_node2 27712 1727096476.11104: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096476.11115: Calling all_plugins_play to load vars for managed_node2 27712 1727096476.11117: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096476.11577: Calling groups_plugins_play to load vars for managed_node2 27712 1727096476.11750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096476.12338: done with get_vars() 27712 1727096476.12348: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 09:01:16 -0400 (0:00:00.118) 0:00:01.818 ****** 27712 1727096476.12558: entering _queue_task() for managed_node2/stat 27712 1727096476.13166: worker is 1 (out of 1 available) 27712 1727096476.13313: exiting _queue_task() for managed_node2/stat 27712 1727096476.13322: done queuing things up, now waiting for results queue to drain 27712 1727096476.13323: waiting for pending results... 27712 1727096476.13809: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 27712 1727096476.13998: in run() - task 0afff68d-5257-cbc7-8716-0000000000d2 27712 1727096476.14074: variable 'ansible_search_path' from source: unknown 27712 1727096476.14078: variable 'ansible_search_path' from source: unknown 27712 1727096476.14081: calling self._execute() 27712 1727096476.14369: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096476.14380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096476.14383: variable 'omit' from source: magic vars 27712 1727096476.15292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096476.15761: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096476.16028: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096476.16032: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096476.16080: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096476.16287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096476.16317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096476.16348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096476.16498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 
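The two conditionals driving this stretch of the run are easier to read outside of Jinja2. The Python sketch below is only a rough approximation of how they evaluate; the fact names and the contents of network_test_required_facts are placeholders (the real list is not shown in this log), and this is not Ansible's own evaluation code.

    # Approximate stand-in for the two conditionals seen in these records.
    # Placeholder data: the actual ansible_facts and
    # network_test_required_facts values are not visible in this log.
    ansible_facts = {"distribution": "Fedora", "os_family": "RedHat"}
    network_test_required_facts = ["distribution", "os_family"]

    # "not ansible_facts.keys() | list | intersect(network_test_required_facts)
    #      == network_test_required_facts"
    intersection = [k for k in ansible_facts if k in network_test_required_facts]
    gather_needed = not (intersection == network_test_required_facts)
    print(gather_needed)      # False -> the fact-gathering task is skipped

    # "not __network_is_ostree is defined"
    host_vars = {}            # __network_is_ostree has not been set yet
    run_ostree_check = "__network_is_ostree" not in host_vars
    print(run_ostree_check)   # True -> the "Check if system is ostree" task runs

With the first expression evaluating False the minimum-facts task is skipped, and with the second evaluating True the stat task goes on to build AnsiballZ_stat.py and push it to managed_node2 over the multiplexed SSH connection, which is what the following records show.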
27712 1727096476.16660: Evaluated conditional (not __network_is_ostree is defined): True 27712 1727096476.16725: variable 'omit' from source: magic vars 27712 1727096476.16765: variable 'omit' from source: magic vars 27712 1727096476.16862: variable 'omit' from source: magic vars 27712 1727096476.16976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096476.17048: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096476.17150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096476.17259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096476.17262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096476.17285: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096476.17294: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096476.17341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096476.17559: Set connection var ansible_connection to ssh 27712 1727096476.17582: Set connection var ansible_pipelining to False 27712 1727096476.17593: Set connection var ansible_timeout to 10 27712 1727096476.17599: Set connection var ansible_shell_type to sh 27712 1727096476.17610: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096476.17693: Set connection var ansible_shell_executable to /bin/sh 27712 1727096476.17706: variable 'ansible_shell_executable' from source: unknown 27712 1727096476.17713: variable 'ansible_connection' from source: unknown 27712 1727096476.17719: variable 'ansible_module_compression' from source: unknown 27712 1727096476.17877: variable 'ansible_shell_type' from source: unknown 27712 1727096476.17879: variable 'ansible_shell_executable' from source: unknown 27712 1727096476.17881: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096476.17883: variable 'ansible_pipelining' from source: unknown 27712 1727096476.17885: variable 'ansible_timeout' from source: unknown 27712 1727096476.17887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096476.18128: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096476.18132: variable 'omit' from source: magic vars 27712 1727096476.18134: starting attempt loop 27712 1727096476.18136: running the handler 27712 1727096476.18137: _low_level_execute_command(): starting 27712 1727096476.18206: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096476.19601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096476.19781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096476.19843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096476.19864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096476.19890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096476.20330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096476.22065: stdout chunk (state=3): >>>/root <<< 27712 1727096476.22192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096476.22208: stderr chunk (state=3): >>><<< 27712 1727096476.22301: stdout chunk (state=3): >>><<< 27712 1727096476.22405: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096476.22417: _low_level_execute_command(): starting 27712 1727096476.22420: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335 `" && echo ansible-tmp-1727096476.2233047-27849-195992427422335="` echo /root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335 `" ) && sleep 0' 27712 1727096476.23681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096476.23878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096476.23975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096476.24112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096476.26378: stdout chunk (state=3): >>>ansible-tmp-1727096476.2233047-27849-195992427422335=/root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335 <<< 27712 1727096476.26431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096476.26441: stdout chunk (state=3): >>><<< 27712 1727096476.26457: stderr chunk (state=3): >>><<< 27712 1727096476.26675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096476.2233047-27849-195992427422335=/root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096476.26679: variable 'ansible_module_compression' from source: unknown 27712 1727096476.26857: ANSIBALLZ: Using lock for stat 27712 1727096476.26869: ANSIBALLZ: Acquiring lock 27712 1727096476.26879: ANSIBALLZ: Lock acquired: 140297911474112 27712 1727096476.27033: ANSIBALLZ: Creating module 27712 1727096476.56009: ANSIBALLZ: Writing module into payload 27712 1727096476.56126: ANSIBALLZ: Writing module 27712 1727096476.56154: ANSIBALLZ: Renaming module 27712 1727096476.56166: ANSIBALLZ: Done creating module 27712 1727096476.56259: variable 'ansible_facts' from source: unknown 27712 1727096476.56356: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/AnsiballZ_stat.py 27712 1727096476.56527: Sending initial data 27712 1727096476.56635: Sent initial data (153 bytes) 27712 1727096476.57258: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096476.57300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096476.57313: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096476.57419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096476.57465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096476.57544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096476.59802: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096476.59844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096476.60019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpg3rmlfkw /root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/AnsiballZ_stat.py <<< 27712 1727096476.60022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/AnsiballZ_stat.py" <<< 27712 1727096476.60486: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpg3rmlfkw" to remote "/root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/AnsiballZ_stat.py" <<< 27712 1727096476.62248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096476.62309: stderr chunk (state=3): >>><<< 27712 1727096476.62361: stdout chunk (state=3): >>><<< 27712 1727096476.62449: done transferring module to remote 27712 1727096476.62615: _low_level_execute_command(): starting 27712 1727096476.62652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/ /root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/AnsiballZ_stat.py && sleep 0' 27712 1727096476.64047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096476.64197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096476.64248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096476.64251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096476.64254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096476.64354: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096476.64357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096476.64360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096476.64362: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096476.64366: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096476.64369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096476.64371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096476.64373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096476.64375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096476.64377: stderr chunk (state=3): >>>debug2: match found <<< 27712 1727096476.64379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096476.64609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096476.64612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096476.64684: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 4 <<< 27712 1727096476.67296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096476.67535: stderr chunk (state=3): >>><<< 27712 1727096476.67538: stdout chunk (state=3): >>><<< 27712 1727096476.67556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096476.67559: _low_level_execute_command(): starting 27712 1727096476.67571: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/AnsiballZ_stat.py && sleep 0' 27712 1727096476.68951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096476.68985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096476.69202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096476.69295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096476.69430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096476.72731: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 27712 1727096476.72795: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 27712 1727096476.72859: stdout chunk (state=3): >>>import 'posix' # <<< 27712 1727096476.72894: 
stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 27712 1727096476.72950: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 27712 1727096476.73003: stdout chunk (state=3): >>>import 'codecs' # <<< 27712 1727096476.73158: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6d084d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6cd7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6d0aa50> <<< 27712 1727096476.73190: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 27712 1727096476.73307: stdout chunk (state=3): >>>import 'io' # <<< 27712 1727096476.73396: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # <<< 27712 1727096476.73440: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 27712 1727096476.73568: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 27712 1727096476.73617: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6abd130> <<< 27712 1727096476.73714: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6abdfa0> <<< 27712 1727096476.73796: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 27712 1727096476.74213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 27712 1727096476.74335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 27712 1727096476.74429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6afbe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 27712 1727096476.74451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6afbf50> <<< 27712 1727096476.74554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 27712 1727096476.74661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 27712 1727096476.74711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b33830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 27712 1727096476.74953: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b33ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b13b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b11280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6af9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 27712 1727096476.75073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 27712 1727096476.75130: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 27712 
1727096476.75212: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b537d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b523f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b12150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b50c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 27712 1727096476.75273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b88860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6af82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 27712 1727096476.75344: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6b88d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b88bc0> <<< 27712 1727096476.75430: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6b88f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6af6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096476.75456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 27712 1727096476.75509: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b89610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b892e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 27712 1727096476.75576: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b8a510> import 'importlib.util' # <<< 27712 1727096476.75599: stdout chunk (state=3): >>>import 'runpy' # <<< 27712 1727096476.75639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 27712 1727096476.75664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6ba0710> <<< 27712 1727096476.75726: stdout chunk (state=3): >>>import 'errno' # <<< 27712 1727096476.75811: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6ba1df0> <<< 27712 1727096476.75929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6ba2c90> <<< 27712 1727096476.75953: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6ba32f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6ba21e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 27712 1727096476.76020: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6ba3d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6ba34a0> <<< 27712 1727096476.76132: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b8a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 27712 1727096476.76135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 27712 1727096476.76258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a691fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 27712 
1727096476.76320: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a69486e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6948440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6948710> <<< 27712 1727096476.76403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 27712 1727096476.76459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096476.76631: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6949040> <<< 27712 1727096476.76797: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a69499a0> <<< 27712 1727096476.76801: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69488f0> <<< 27712 1727096476.76926: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a691dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 27712 1727096476.76958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a694adb0> <<< 27712 1727096476.77114: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6949af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b8ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 27712 1727096476.77188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 27712 1727096476.77376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6977110> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 27712 1727096476.77530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69974a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 27712 1727096476.77540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 27712 1727096476.77905: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69f8260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 27712 1727096476.77909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69fa9c0> <<< 27712 1727096476.77928: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69f8380> <<< 27712 1727096476.77969: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69c1280> <<< 27712 1727096476.78003: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6329340> <<< 27712 1727096476.78109: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69962a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a694bce0> <<< 27712 1727096476.78185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 27712 1727096476.78208: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa9a63295b0> <<< 27712 1727096476.78497: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_p2fn0sj4/ansible_stat_payload.zip' # zipimport: zlib available <<< 27712 1727096476.78716: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 27712 1727096476.78832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 27712 1727096476.78932: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a637f0b0> import '_typing' # <<< 27712 1727096476.79238: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a635dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a635d160> # zipimport: zlib available <<< 27712 1727096476.79342: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 27712 1727096476.79407: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.81585: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.83566: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a637cf80> <<< 27712 1727096476.83812: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 27712 1727096476.83905: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a63a69c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63a6750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63a6060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63a67e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a637fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a63a7740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fa9a63a7980> <<< 27712 1727096476.84001: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 27712 1727096476.84005: stdout chunk (state=3): >>>import '_locale' # <<< 27712 1727096476.84140: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63a7ec0> import 'pwd' # <<< 27712 1727096476.84203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6211ca0> <<< 27712 1727096476.84260: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a62138c0> <<< 27712 1727096476.84365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 27712 1727096476.84446: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62142c0> <<< 27712 1727096476.84462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6215460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 27712 1727096476.84542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 27712 1727096476.84564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 27712 1727096476.84603: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6217f20> <<< 27712 1727096476.84726: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a69482f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62161e0> <<< 27712 1727096476.84754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 27712 1727096476.84790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 27712 1727096476.84808: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 27712 1727096476.84980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a621fce0> <<< 27712 1727096476.85011: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a621e7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a621e510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 27712 1727096476.85115: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6215010> <<< 27712 1727096476.85200: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62166f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6267f50> <<< 27712 1727096476.85260: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62680e0> <<< 27712 1727096476.85384: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 27712 1727096476.85463: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6269bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6269970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 27712 1727096476.85670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a626c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fa9a626a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 27712 1727096476.85750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096476.85775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 27712 1727096476.85791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 27712 1727096476.85986: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a626f830> <<< 27712 1727096476.86146: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a626c200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a62705f0> <<< 27712 1727096476.86220: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6270830> <<< 27712 1727096476.86429: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6270b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 27712 1727096476.86437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 27712 1727096476.86503: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a62fc1d0> <<< 27712 1727096476.86731: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a62fd400> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6272990> <<< 27712 1727096476.86786: stdout 
chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6273d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62725d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 27712 1727096476.86936: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.87053: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.87108: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 27712 1727096476.87384: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.87439: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.88389: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.89326: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 27712 1727096476.89374: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 27712 1727096476.89403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096476.89448: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a61016a0> <<< 27712 1727096476.89583: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a61024b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62fd670> <<< 27712 1727096476.89658: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 27712 1727096476.89692: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.89724: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 27712 1727096476.90012: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.90201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 27712 1727096476.90229: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6102270> <<< 27712 1727096476.90252: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 
1727096476.90996: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.91778: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.91917: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.91990: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 27712 1727096476.92032: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.92091: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 27712 1727096476.92314: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 27712 1727096476.92364: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096476.92380: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 27712 1727096476.92426: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.92493: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 27712 1727096476.92497: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.92865: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.93234: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 27712 1727096476.93335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 27712 1727096476.93350: stdout chunk (state=3): >>>import '_ast' # <<< 27712 1727096476.93444: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6103620> # zipimport: zlib available <<< 27712 1727096476.93557: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.93692: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 27712 1727096476.93700: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 27712 1727096476.93720: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.93774: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.93823: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 27712 1727096476.93938: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096476.94027: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.94126: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 27712 1727096476.94183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096476.94323: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a610e090> <<< 27712 1727096476.94359: stdout chunk (state=3): 
>>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6108fb0> <<< 27712 1727096476.94398: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 27712 1727096476.94503: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 27712 1727096476.94594: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.94629: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.94694: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 27712 1727096476.94725: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 27712 1727096476.94775: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 27712 1727096476.94793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 27712 1727096476.94888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 27712 1727096476.94926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 27712 1727096476.94929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 27712 1727096476.95021: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63faa20> <<< 27712 1727096476.95073: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63ee6f0> <<< 27712 1727096476.95193: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a610e150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6103020> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 27712 1727096476.95241: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.95338: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 27712 1727096476.95382: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 27712 1727096476.95398: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.95604: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.95900: stdout chunk (state=3): >>># zipimport: zlib available <<< 27712 1727096476.96099: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 27712 1727096476.96115: stdout chunk (state=3): >>># destroy __main__ <<< 27712 1727096476.96650: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear 
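
The lone JSON object in the chunk above, {"changed": false, "stat": {"exists": false}, "invocation": {...}}, is the entire result of this task: the stat module probed /run/ostree-booted, found it absent (as expected on a host not booted via OSTree), and echoed its effective module_args back under "invocation". Everything else in these chunks is the interpreter's verbose import and shutdown tracing. A minimal sketch, not the real ansible.builtin.stat implementation, of assembling and emitting that kind of result for a missing path:

import json
import os

def stat_result(path):
    # Illustrative helper only. The real module reports many more fields
    # (mode, uid, checksum, ...); for a nonexistent path it short-circuits
    # to just {"exists": False}.
    if not os.path.lexists(path):
        return {"changed": False, "stat": {"exists": False}}
    st = os.stat(path)
    return {"changed": False, "stat": {"exists": True, "size": st.st_size}}

# /run/ostree-booted is the path probed in this run.
print(json.dumps(stat_result("/run/ostree-booted")))
# -> {"changed": false, "stat": {"exists": false}}
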
sys.ps1 <<< 27712 1727096476.96728: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 27712 1727096476.96831: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing 
encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six <<< 27712 1727096476.96854: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 27712 1727096476.97129: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery <<< 27712 1727096476.97191: stdout chunk (state=3): >>># destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 27712 1727096476.97195: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 27712 1727096476.97220: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 27712 1727096476.97272: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport <<< 27712 1727096476.97283: stdout chunk (state=3): >>># destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 27712 1727096476.97334: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 27712 1727096476.97349: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 27712 1727096476.97366: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 27712 1727096476.97431: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 27712 1727096476.97476: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # 
cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 27712 1727096476.97500: stdout chunk (state=3): >>># cleanup[3] wiping math # cleanup[3] wiping warnings <<< 27712 1727096476.97530: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 27712 1727096476.97560: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 27712 1727096476.97730: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket <<< 27712 1727096476.97911: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 27712 1727096476.97959: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 27712 1727096476.98001: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 27712 1727096476.98047: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 27712 
1727096476.98052: stdout chunk (state=3): >>># destroy itertools <<< 27712 1727096476.98082: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 27712 1727096476.98564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096476.98651: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. <<< 27712 1727096476.98655: stdout chunk (state=3): >>><<< 27712 1727096476.98657: stderr chunk (state=3): >>><<< 27712 1727096476.98773: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6d084d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6cd7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6d0aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6abd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6abdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
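
With the shared SSH connection reporting exit status 0, _low_level_execute_command() returns rc=0 together with the full stdout reproduced here, and the controller still has to recover the single JSON result line from all of the surrounding import and cleanup noise. A simplified illustration of that filtering step (extract_module_result is a hypothetical helper, not Ansible's actual parser):

import json

def extract_module_result(stdout):
    # The verbose interpreter lines start with '#', 'import', 'cleanup', etc.;
    # only the module's result line parses as a JSON object.
    for line in stdout.splitlines():
        line = line.strip()
        if line.startswith("{"):
            try:
                return json.loads(line)
            except json.JSONDecodeError:
                continue
    raise ValueError("no JSON result found in module stdout")

example = (
    "# zipimport: zlib available\n"
    '{"changed": false, "stat": {"exists": false}}\n'
    "# destroy __main__\n"
)
print(extract_module_result(example))  # -> {'changed': False, 'stat': {'exists': False}}
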
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6afbe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6afbf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b33830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b33ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b13b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b11280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6af9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b537d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b523f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b12150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b50c20> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b88860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6af82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6b88d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b88bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6b88f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6af6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b89610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b892e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b8a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6ba0710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6ba1df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa9a6ba2c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6ba32f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6ba21e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6ba3d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6ba34a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b8a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a691fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a69486e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6948440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6948710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6949040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a69499a0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69488f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a691dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a694adb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6949af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6b8ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6977110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69974a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69f8260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69fa9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69f8380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69c1280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6329340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a69962a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a694bce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fa9a63295b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_p2fn0sj4/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a637f0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a635dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a635d160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a637cf80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a63a69c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63a6750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63a6060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63a67e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a637fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a63a7740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a63a7980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63a7ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6211ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a62138c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62142c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6215460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6217f20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a69482f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62161e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a621fce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a621e7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a621e510> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6215010> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62166f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6267f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62680e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6269bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6269970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a626c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a626a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a626f830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a626c200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a62705f0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6270830> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6270b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a62fc1d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a62fd400> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6272990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a6273d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62725d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a61016a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a61024b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a62fd670> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6102270> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6103620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa9a610e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6108fb0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63faa20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a63ee6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a610e150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa9a6103020> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
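The module run captured above is Ansible's stat module checking for /run/ostree-booted; the JSON result embedded in the stdout reports "exists": false, and everything after it is the Python interpreter's shutdown trace, which Ansible flags as junk in the warning that follows. A task shaped roughly like the sketch below would produce this invocation; the register name is taken from the __ostree_booted_stat variable the executor resolves later in this trace, and the actual contents of the task file are not reproduced in this log, so the sketch is an approximation rather than the file's real text.

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat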
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 27712 1727096477.00031: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096477.00036: _low_level_execute_command(): starting 27712 1727096477.00038: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727096476.2233047-27849-195992427422335/ > /dev/null 2>&1 && sleep 0' 27712 1727096477.00117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096477.00154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096477.00267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096477.00290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096477.00387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096477.00484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096477.03322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096477.03327: stdout chunk (state=3): >>><<< 27712 1727096477.03329: stderr chunk (state=3): >>><<< 27712 1727096477.03425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096477.03429: handler run complete 27712 1727096477.03432: attempt loop complete, returning result 27712 1727096477.03434: _execute() done 27712 1727096477.03436: dumping result to json 27712 1727096477.03439: done dumping result, returning 27712 1727096477.03441: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0afff68d-5257-cbc7-8716-0000000000d2] 27712 1727096477.03443: sending task result for task 0afff68d-5257-cbc7-8716-0000000000d2 ok: [managed_node2] => { "changed": false, "stat": { 
"exists": false } } 27712 1727096477.03844: no more pending results, returning what we have 27712 1727096477.03848: results queue empty 27712 1727096477.03849: checking for any_errors_fatal 27712 1727096477.03854: done checking for any_errors_fatal 27712 1727096477.03855: checking for max_fail_percentage 27712 1727096477.03857: done checking for max_fail_percentage 27712 1727096477.03858: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.03859: done checking to see if all hosts have failed 27712 1727096477.03859: getting the remaining hosts for this loop 27712 1727096477.03861: done getting the remaining hosts for this loop 27712 1727096477.03864: getting the next task for host managed_node2 27712 1727096477.03875: done getting next task for host managed_node2 27712 1727096477.03877: ^ task is: TASK: Set flag to indicate system is ostree 27712 1727096477.03880: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096477.03884: getting variables 27712 1727096477.03886: in VariableManager get_vars() 27712 1727096477.03916: Calling all_inventory to load vars for managed_node2 27712 1727096477.03920: Calling groups_inventory to load vars for managed_node2 27712 1727096477.03923: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.03934: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.03937: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.03940: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.04625: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000d2 27712 1727096477.04628: WORKER PROCESS EXITING 27712 1727096477.04673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.05063: done with get_vars() 27712 1727096477.05080: done getting variables 27712 1727096477.05189: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 09:01:17 -0400 (0:00:00.927) 0:00:02.746 ****** 27712 1727096477.05291: entering _queue_task() for managed_node2/set_fact 27712 1727096477.05293: Creating lock for set_fact 27712 1727096477.05942: worker is 1 (out of 1 available) 27712 1727096477.05956: exiting _queue_task() for managed_node2/set_fact 27712 1727096477.05992: done queuing things up, now waiting for results queue to drain 27712 1727096477.05994: waiting for pending results... 
27712 1727096477.06187: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 27712 1727096477.06312: in run() - task 0afff68d-5257-cbc7-8716-0000000000d3 27712 1727096477.06335: variable 'ansible_search_path' from source: unknown 27712 1727096477.06344: variable 'ansible_search_path' from source: unknown 27712 1727096477.06398: calling self._execute() 27712 1727096477.06497: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.06511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.06591: variable 'omit' from source: magic vars 27712 1727096477.07091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096477.07436: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096477.07501: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096477.07539: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096477.07588: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096477.07691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096477.07774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096477.07787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096477.07801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096477.07946: Evaluated conditional (not __network_is_ostree is defined): True 27712 1727096477.07956: variable 'omit' from source: magic vars 27712 1727096477.08007: variable 'omit' from source: magic vars 27712 1727096477.08136: variable '__ostree_booted_stat' from source: set_fact 27712 1727096477.08197: variable 'omit' from source: magic vars 27712 1727096477.08274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096477.08277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096477.08291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096477.08310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096477.08327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096477.08362: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096477.08372: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.08384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.08490: Set connection var ansible_connection to ssh 27712 
1727096477.08510: Set connection var ansible_pipelining to False 27712 1727096477.08544: Set connection var ansible_timeout to 10 27712 1727096477.08549: Set connection var ansible_shell_type to sh 27712 1727096477.08554: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096477.08556: Set connection var ansible_shell_executable to /bin/sh 27712 1727096477.08578: variable 'ansible_shell_executable' from source: unknown 27712 1727096477.08597: variable 'ansible_connection' from source: unknown 27712 1727096477.08600: variable 'ansible_module_compression' from source: unknown 27712 1727096477.08601: variable 'ansible_shell_type' from source: unknown 27712 1727096477.08607: variable 'ansible_shell_executable' from source: unknown 27712 1727096477.08653: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.08656: variable 'ansible_pipelining' from source: unknown 27712 1727096477.08660: variable 'ansible_timeout' from source: unknown 27712 1727096477.08664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.08746: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096477.08764: variable 'omit' from source: magic vars 27712 1727096477.08779: starting attempt loop 27712 1727096477.08786: running the handler 27712 1727096477.08799: handler run complete 27712 1727096477.08816: attempt loop complete, returning result 27712 1727096477.08826: _execute() done 27712 1727096477.08871: dumping result to json 27712 1727096477.08874: done dumping result, returning 27712 1727096477.08876: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0afff68d-5257-cbc7-8716-0000000000d3] 27712 1727096477.08881: sending task result for task 0afff68d-5257-cbc7-8716-0000000000d3 ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 27712 1727096477.09053: no more pending results, returning what we have 27712 1727096477.09056: results queue empty 27712 1727096477.09057: checking for any_errors_fatal 27712 1727096477.09062: done checking for any_errors_fatal 27712 1727096477.09063: checking for max_fail_percentage 27712 1727096477.09065: done checking for max_fail_percentage 27712 1727096477.09065: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.09066: done checking to see if all hosts have failed 27712 1727096477.09069: getting the remaining hosts for this loop 27712 1727096477.09070: done getting the remaining hosts for this loop 27712 1727096477.09074: getting the next task for host managed_node2 27712 1727096477.09179: done getting next task for host managed_node2 27712 1727096477.09182: ^ task is: TASK: Fix CentOS6 Base repo 27712 1727096477.09194: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 27712 1727096477.09199: getting variables 27712 1727096477.09201: in VariableManager get_vars() 27712 1727096477.09232: Calling all_inventory to load vars for managed_node2 27712 1727096477.09235: Calling groups_inventory to load vars for managed_node2 27712 1727096477.09239: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.09311: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.09315: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.09318: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.09625: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000d3 27712 1727096477.09634: WORKER PROCESS EXITING 27712 1727096477.09659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.09863: done with get_vars() 27712 1727096477.09880: done getting variables 27712 1727096477.10005: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 09:01:17 -0400 (0:00:00.047) 0:00:02.793 ****** 27712 1727096477.10039: entering _queue_task() for managed_node2/copy 27712 1727096477.10508: worker is 1 (out of 1 available) 27712 1727096477.10517: exiting _queue_task() for managed_node2/copy 27712 1727096477.10527: done queuing things up, now waiting for results queue to drain 27712 1727096477.10528: waiting for pending results... 
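The 'Fix CentOS6 Base repo' task at el_repo_setup.yml:26 loads the copy action plugin and, as the trace below shows, is skipped because ansible_distribution is CentOS but ansible_distribution_major_version is not '6'. A copy task gated on those two conditions might look like the following; the destination path and file body are placeholders, since the log records only the conditionals, not the task arguments.

- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed destination, not shown in the log
    content: |
      # placeholder repo definition; the real body is not recorded in this output
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'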
27712 1727096477.10617: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 27712 1727096477.10732: in run() - task 0afff68d-5257-cbc7-8716-0000000000d5 27712 1727096477.10754: variable 'ansible_search_path' from source: unknown 27712 1727096477.10764: variable 'ansible_search_path' from source: unknown 27712 1727096477.10806: calling self._execute() 27712 1727096477.10895: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.10908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.10926: variable 'omit' from source: magic vars 27712 1727096477.11306: variable 'ansible_distribution' from source: facts 27712 1727096477.11322: Evaluated conditional (ansible_distribution == 'CentOS'): True 27712 1727096477.11408: variable 'ansible_distribution_major_version' from source: facts 27712 1727096477.11412: Evaluated conditional (ansible_distribution_major_version == '6'): False 27712 1727096477.11417: when evaluation is False, skipping this task 27712 1727096477.11420: _execute() done 27712 1727096477.11422: dumping result to json 27712 1727096477.11426: done dumping result, returning 27712 1727096477.11433: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0afff68d-5257-cbc7-8716-0000000000d5] 27712 1727096477.11437: sending task result for task 0afff68d-5257-cbc7-8716-0000000000d5 27712 1727096477.11526: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000d5 27712 1727096477.11528: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 27712 1727096477.11575: no more pending results, returning what we have 27712 1727096477.11578: results queue empty 27712 1727096477.11579: checking for any_errors_fatal 27712 1727096477.11582: done checking for any_errors_fatal 27712 1727096477.11583: checking for max_fail_percentage 27712 1727096477.11584: done checking for max_fail_percentage 27712 1727096477.11585: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.11586: done checking to see if all hosts have failed 27712 1727096477.11586: getting the remaining hosts for this loop 27712 1727096477.11587: done getting the remaining hosts for this loop 27712 1727096477.11591: getting the next task for host managed_node2 27712 1727096477.11597: done getting next task for host managed_node2 27712 1727096477.11600: ^ task is: TASK: Include the task 'enable_epel.yml' 27712 1727096477.11603: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.11607: getting variables 27712 1727096477.11608: in VariableManager get_vars() 27712 1727096477.11636: Calling all_inventory to load vars for managed_node2 27712 1727096477.11639: Calling groups_inventory to load vars for managed_node2 27712 1727096477.11642: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.11651: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.11653: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.11656: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.11792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.11909: done with get_vars() 27712 1727096477.11916: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 09:01:17 -0400 (0:00:00.019) 0:00:02.813 ****** 27712 1727096477.11983: entering _queue_task() for managed_node2/include_tasks 27712 1727096477.12180: worker is 1 (out of 1 available) 27712 1727096477.12192: exiting _queue_task() for managed_node2/include_tasks 27712 1727096477.12204: done queuing things up, now waiting for results queue to drain 27712 1727096477.12205: waiting for pending results... 27712 1727096477.12346: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 27712 1727096477.12403: in run() - task 0afff68d-5257-cbc7-8716-0000000000d6 27712 1727096477.12412: variable 'ansible_search_path' from source: unknown 27712 1727096477.12415: variable 'ansible_search_path' from source: unknown 27712 1727096477.12446: calling self._execute() 27712 1727096477.12503: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.12507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.12515: variable 'omit' from source: magic vars 27712 1727096477.13177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096477.14687: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096477.14742: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096477.14770: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096477.14798: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096477.14818: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096477.14880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096477.14901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096477.14919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 27712 1727096477.14949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096477.14963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096477.15059: variable '__network_is_ostree' from source: set_fact 27712 1727096477.15083: Evaluated conditional (not __network_is_ostree | d(false)): True 27712 1727096477.15089: _execute() done 27712 1727096477.15091: dumping result to json 27712 1727096477.15096: done dumping result, returning 27712 1727096477.15102: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-cbc7-8716-0000000000d6] 27712 1727096477.15107: sending task result for task 0afff68d-5257-cbc7-8716-0000000000d6 27712 1727096477.15198: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000d6 27712 1727096477.15202: WORKER PROCESS EXITING 27712 1727096477.15229: no more pending results, returning what we have 27712 1727096477.15234: in VariableManager get_vars() 27712 1727096477.15270: Calling all_inventory to load vars for managed_node2 27712 1727096477.15275: Calling groups_inventory to load vars for managed_node2 27712 1727096477.15279: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.15289: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.15304: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.15307: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.15526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.15637: done with get_vars() 27712 1727096477.15643: variable 'ansible_search_path' from source: unknown 27712 1727096477.15644: variable 'ansible_search_path' from source: unknown 27712 1727096477.15672: we have included files to process 27712 1727096477.15673: generating all_blocks data 27712 1727096477.15674: done generating all_blocks data 27712 1727096477.15678: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27712 1727096477.15679: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27712 1727096477.15680: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 27712 1727096477.16300: done processing included file 27712 1727096477.16302: iterating over new_blocks loaded from include file 27712 1727096477.16304: in VariableManager get_vars() 27712 1727096477.16316: done with get_vars() 27712 1727096477.16317: filtering new block on tags 27712 1727096477.16340: done filtering new block on tags 27712 1727096477.16343: in VariableManager get_vars() 27712 1727096477.16355: done with get_vars() 27712 1727096477.16356: filtering new block on tags 27712 1727096477.16369: done filtering new block on tags 27712 1727096477.16371: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 27712 1727096477.16377: extending task lists for all hosts with included blocks 
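The include evaluated just above is an include_tasks guarded by the __network_is_ostree fact; because the conditional (not __network_is_ostree | d(false)) is True, enable_epel.yml is loaded and its blocks are appended to the host's task list. A sketch consistent with the log, where only the file name and the conditional are taken from the log and the rest is assumed:

- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)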
27712 1727096477.16499: done extending task lists 27712 1727096477.16501: done processing included files 27712 1727096477.16502: results queue empty 27712 1727096477.16502: checking for any_errors_fatal 27712 1727096477.16505: done checking for any_errors_fatal 27712 1727096477.16506: checking for max_fail_percentage 27712 1727096477.16507: done checking for max_fail_percentage 27712 1727096477.16508: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.16509: done checking to see if all hosts have failed 27712 1727096477.16510: getting the remaining hosts for this loop 27712 1727096477.16511: done getting the remaining hosts for this loop 27712 1727096477.16513: getting the next task for host managed_node2 27712 1727096477.16517: done getting next task for host managed_node2 27712 1727096477.16519: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 27712 1727096477.16522: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.16524: getting variables 27712 1727096477.16525: in VariableManager get_vars() 27712 1727096477.16533: Calling all_inventory to load vars for managed_node2 27712 1727096477.16535: Calling groups_inventory to load vars for managed_node2 27712 1727096477.16537: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.16542: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.16550: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.16553: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.16689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.16876: done with get_vars() 27712 1727096477.16884: done getting variables 27712 1727096477.16949: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 27712 1727096477.17149: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 09:01:17 -0400 (0:00:00.052) 0:00:02.865 ****** 27712 1727096477.17195: entering _queue_task() for managed_node2/command 27712 1727096477.17197: Creating lock for command 27712 1727096477.17520: worker is 1 (out of 1 available) 27712 1727096477.17534: exiting _queue_task() for managed_node2/command 27712 1727096477.17545: done queuing things up, now waiting for results queue to drain 27712 1727096477.17546: waiting for pending results... 
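The task name printed above, "Create EPEL 10", is the templated form of "Create EPEL {{ ansible_distribution_major_version }}" from enable_epel.yml:8, dispatched through the command action. The entries that follow show it guarded by ansible_distribution in ['RedHat', 'CentOS'] and ansible_distribution_major_version in ['7', '8']. A hedged sketch of such a task; the actual command is not recorded in the log, so the epel-release URL below is an assumption:

# the exact command is not shown in the log; the epel-release URL is an assumption
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: dnf install -y https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

On this host the major version is 10, so the version condition is False and the task is skipped.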
27712 1727096477.17985: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 27712 1727096477.17989: in run() - task 0afff68d-5257-cbc7-8716-0000000000f0 27712 1727096477.17992: variable 'ansible_search_path' from source: unknown 27712 1727096477.17994: variable 'ansible_search_path' from source: unknown 27712 1727096477.17997: calling self._execute() 27712 1727096477.18041: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.18052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.18065: variable 'omit' from source: magic vars 27712 1727096477.18428: variable 'ansible_distribution' from source: facts 27712 1727096477.18449: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27712 1727096477.18583: variable 'ansible_distribution_major_version' from source: facts 27712 1727096477.18594: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27712 1727096477.18602: when evaluation is False, skipping this task 27712 1727096477.18609: _execute() done 27712 1727096477.18615: dumping result to json 27712 1727096477.18622: done dumping result, returning 27712 1727096477.18633: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [0afff68d-5257-cbc7-8716-0000000000f0] 27712 1727096477.18642: sending task result for task 0afff68d-5257-cbc7-8716-0000000000f0 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27712 1727096477.18809: no more pending results, returning what we have 27712 1727096477.18813: results queue empty 27712 1727096477.18814: checking for any_errors_fatal 27712 1727096477.18815: done checking for any_errors_fatal 27712 1727096477.18816: checking for max_fail_percentage 27712 1727096477.18818: done checking for max_fail_percentage 27712 1727096477.18818: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.18819: done checking to see if all hosts have failed 27712 1727096477.18820: getting the remaining hosts for this loop 27712 1727096477.18821: done getting the remaining hosts for this loop 27712 1727096477.18825: getting the next task for host managed_node2 27712 1727096477.18831: done getting next task for host managed_node2 27712 1727096477.18834: ^ task is: TASK: Install yum-utils package 27712 1727096477.18838: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.18842: getting variables 27712 1727096477.18844: in VariableManager get_vars() 27712 1727096477.18877: Calling all_inventory to load vars for managed_node2 27712 1727096477.18880: Calling groups_inventory to load vars for managed_node2 27712 1727096477.18884: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.18897: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.18900: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.18903: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.19222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.19584: done with get_vars() 27712 1727096477.19593: done getting variables 27712 1727096477.19623: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000f0 27712 1727096477.19626: WORKER PROCESS EXITING 27712 1727096477.19700: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 09:01:17 -0400 (0:00:00.025) 0:00:02.890 ****** 27712 1727096477.19727: entering _queue_task() for managed_node2/package 27712 1727096477.19728: Creating lock for package 27712 1727096477.19991: worker is 1 (out of 1 available) 27712 1727096477.20002: exiting _queue_task() for managed_node2/package 27712 1727096477.20012: done queuing things up, now waiting for results queue to drain 27712 1727096477.20013: waiting for pending results... 
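Next in enable_epel.yml is "Install yum-utils package" (line 26), run through the generic package action and guarded by the same distribution/version conditions, as the evaluations below confirm. A minimal sketch under those assumptions, where the package name is implied by the task name and the state is assumed:

- name: Install yum-utils package
  package:
    name: yum-utils        # implied by the task name
    state: present         # assumed
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']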
27712 1727096477.20245: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 27712 1727096477.20348: in run() - task 0afff68d-5257-cbc7-8716-0000000000f1 27712 1727096477.20373: variable 'ansible_search_path' from source: unknown 27712 1727096477.20381: variable 'ansible_search_path' from source: unknown 27712 1727096477.20418: calling self._execute() 27712 1727096477.20498: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.20511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.20523: variable 'omit' from source: magic vars 27712 1727096477.20890: variable 'ansible_distribution' from source: facts 27712 1727096477.20912: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27712 1727096477.21045: variable 'ansible_distribution_major_version' from source: facts 27712 1727096477.21057: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27712 1727096477.21065: when evaluation is False, skipping this task 27712 1727096477.21074: _execute() done 27712 1727096477.21081: dumping result to json 27712 1727096477.21088: done dumping result, returning 27712 1727096477.21099: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0afff68d-5257-cbc7-8716-0000000000f1] 27712 1727096477.21109: sending task result for task 0afff68d-5257-cbc7-8716-0000000000f1 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27712 1727096477.21407: no more pending results, returning what we have 27712 1727096477.21411: results queue empty 27712 1727096477.21412: checking for any_errors_fatal 27712 1727096477.21417: done checking for any_errors_fatal 27712 1727096477.21417: checking for max_fail_percentage 27712 1727096477.21419: done checking for max_fail_percentage 27712 1727096477.21419: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.21420: done checking to see if all hosts have failed 27712 1727096477.21421: getting the remaining hosts for this loop 27712 1727096477.21422: done getting the remaining hosts for this loop 27712 1727096477.21426: getting the next task for host managed_node2 27712 1727096477.21431: done getting next task for host managed_node2 27712 1727096477.21434: ^ task is: TASK: Enable EPEL 7 27712 1727096477.21438: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.21442: getting variables 27712 1727096477.21443: in VariableManager get_vars() 27712 1727096477.21473: Calling all_inventory to load vars for managed_node2 27712 1727096477.21476: Calling groups_inventory to load vars for managed_node2 27712 1727096477.21480: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.21490: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.21494: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.21497: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.21733: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000f1 27712 1727096477.21737: WORKER PROCESS EXITING 27712 1727096477.21759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.21950: done with get_vars() 27712 1727096477.21960: done getting variables 27712 1727096477.22018: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 09:01:17 -0400 (0:00:00.023) 0:00:02.913 ****** 27712 1727096477.22045: entering _queue_task() for managed_node2/command 27712 1727096477.22298: worker is 1 (out of 1 available) 27712 1727096477.22309: exiting _queue_task() for managed_node2/command 27712 1727096477.22320: done queuing things up, now waiting for results queue to drain 27712 1727096477.22321: waiting for pending results... 
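The "Enable EPEL 7" task at enable_epel.yml:32 again uses the command action with the same guard, evaluated in the entries below. Which command it runs is not recorded here; yum-config-manager is only a plausible stand-in:

# hypothetical command; the log does not show what "Enable EPEL 7" actually runs
- name: Enable EPEL 7
  command: yum-config-manager --enable epel
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']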
27712 1727096477.22556: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 27712 1727096477.22664: in run() - task 0afff68d-5257-cbc7-8716-0000000000f2 27712 1727096477.22688: variable 'ansible_search_path' from source: unknown 27712 1727096477.22695: variable 'ansible_search_path' from source: unknown 27712 1727096477.22732: calling self._execute() 27712 1727096477.22809: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.22820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.22833: variable 'omit' from source: magic vars 27712 1727096477.23243: variable 'ansible_distribution' from source: facts 27712 1727096477.23261: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27712 1727096477.23387: variable 'ansible_distribution_major_version' from source: facts 27712 1727096477.23398: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27712 1727096477.23406: when evaluation is False, skipping this task 27712 1727096477.23412: _execute() done 27712 1727096477.23419: dumping result to json 27712 1727096477.23431: done dumping result, returning 27712 1727096477.23441: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0afff68d-5257-cbc7-8716-0000000000f2] 27712 1727096477.23451: sending task result for task 0afff68d-5257-cbc7-8716-0000000000f2 27712 1727096477.23673: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000f2 27712 1727096477.23677: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27712 1727096477.23722: no more pending results, returning what we have 27712 1727096477.23725: results queue empty 27712 1727096477.23726: checking for any_errors_fatal 27712 1727096477.23731: done checking for any_errors_fatal 27712 1727096477.23732: checking for max_fail_percentage 27712 1727096477.23734: done checking for max_fail_percentage 27712 1727096477.23734: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.23735: done checking to see if all hosts have failed 27712 1727096477.23736: getting the remaining hosts for this loop 27712 1727096477.23737: done getting the remaining hosts for this loop 27712 1727096477.23741: getting the next task for host managed_node2 27712 1727096477.23747: done getting next task for host managed_node2 27712 1727096477.23749: ^ task is: TASK: Enable EPEL 8 27712 1727096477.23753: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.23757: getting variables 27712 1727096477.23759: in VariableManager get_vars() 27712 1727096477.23791: Calling all_inventory to load vars for managed_node2 27712 1727096477.23794: Calling groups_inventory to load vars for managed_node2 27712 1727096477.23797: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.23809: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.23812: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.23814: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.24133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.24322: done with get_vars() 27712 1727096477.24331: done getting variables 27712 1727096477.24388: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 09:01:17 -0400 (0:00:00.023) 0:00:02.937 ****** 27712 1727096477.24417: entering _queue_task() for managed_node2/command 27712 1727096477.24660: worker is 1 (out of 1 available) 27712 1727096477.24674: exiting _queue_task() for managed_node2/command 27712 1727096477.24685: done queuing things up, now waiting for results queue to drain 27712 1727096477.24687: waiting for pending results... 27712 1727096477.24913: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 27712 1727096477.25027: in run() - task 0afff68d-5257-cbc7-8716-0000000000f3 27712 1727096477.25050: variable 'ansible_search_path' from source: unknown 27712 1727096477.25059: variable 'ansible_search_path' from source: unknown 27712 1727096477.25101: calling self._execute() 27712 1727096477.25181: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.25194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.25209: variable 'omit' from source: magic vars 27712 1727096477.25673: variable 'ansible_distribution' from source: facts 27712 1727096477.25678: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27712 1727096477.25722: variable 'ansible_distribution_major_version' from source: facts 27712 1727096477.25733: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 27712 1727096477.25741: when evaluation is False, skipping this task 27712 1727096477.25748: _execute() done 27712 1727096477.25756: dumping result to json 27712 1727096477.25764: done dumping result, returning 27712 1727096477.25777: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0afff68d-5257-cbc7-8716-0000000000f3] 27712 1727096477.25788: sending task result for task 0afff68d-5257-cbc7-8716-0000000000f3 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 27712 1727096477.26015: no more pending results, returning what we have 27712 1727096477.26019: results queue empty 27712 1727096477.26020: checking for any_errors_fatal 27712 1727096477.26025: done checking 
for any_errors_fatal 27712 1727096477.26026: checking for max_fail_percentage 27712 1727096477.26028: done checking for max_fail_percentage 27712 1727096477.26029: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.26030: done checking to see if all hosts have failed 27712 1727096477.26030: getting the remaining hosts for this loop 27712 1727096477.26031: done getting the remaining hosts for this loop 27712 1727096477.26035: getting the next task for host managed_node2 27712 1727096477.26045: done getting next task for host managed_node2 27712 1727096477.26047: ^ task is: TASK: Enable EPEL 6 27712 1727096477.26052: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096477.26055: getting variables 27712 1727096477.26057: in VariableManager get_vars() 27712 1727096477.26099: Calling all_inventory to load vars for managed_node2 27712 1727096477.26102: Calling groups_inventory to load vars for managed_node2 27712 1727096477.26109: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.26122: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.26125: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.26128: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.26464: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000f3 27712 1727096477.26469: WORKER PROCESS EXITING 27712 1727096477.26493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.26710: done with get_vars() 27712 1727096477.26720: done getting variables 27712 1727096477.26793: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 09:01:17 -0400 (0:00:00.024) 0:00:02.961 ****** 27712 1727096477.26828: entering _queue_task() for managed_node2/copy 27712 1727096477.27242: worker is 1 (out of 1 available) 27712 1727096477.27255: exiting _queue_task() for managed_node2/copy 27712 1727096477.27483: done queuing things up, now waiting for results queue to drain 27712 1727096477.27486: waiting for pending results... 
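"Enable EPEL 6" (enable_epel.yml:42) switches to the copy action and, as shown below, is gated on ansible_distribution_major_version == '6'. A sketch of what such a task could look like; the destination and repo definition are assumptions, since the log records only the module and the conditionals:

- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo              # assumed destination
    content: |
      # assumed repo definition for EPEL 6
      [epel]
      name=Extra Packages for Enterprise Linux 6
      baseurl=https://archives.fedoraproject.org/pub/archive/epel/6/$basearch/
      enabled=1
      gpgcheck=0
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'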
27712 1727096477.27885: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 27712 1727096477.28240: in run() - task 0afff68d-5257-cbc7-8716-0000000000f5 27712 1727096477.28259: variable 'ansible_search_path' from source: unknown 27712 1727096477.28284: variable 'ansible_search_path' from source: unknown 27712 1727096477.28456: calling self._execute() 27712 1727096477.28509: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.28553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.28677: variable 'omit' from source: magic vars 27712 1727096477.29002: variable 'ansible_distribution' from source: facts 27712 1727096477.29021: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 27712 1727096477.29190: variable 'ansible_distribution_major_version' from source: facts 27712 1727096477.29200: Evaluated conditional (ansible_distribution_major_version == '6'): False 27712 1727096477.29211: when evaluation is False, skipping this task 27712 1727096477.29217: _execute() done 27712 1727096477.29222: dumping result to json 27712 1727096477.29229: done dumping result, returning 27712 1727096477.29237: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0afff68d-5257-cbc7-8716-0000000000f5] 27712 1727096477.29247: sending task result for task 0afff68d-5257-cbc7-8716-0000000000f5 27712 1727096477.29514: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000f5 27712 1727096477.29518: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 27712 1727096477.29581: no more pending results, returning what we have 27712 1727096477.29584: results queue empty 27712 1727096477.29585: checking for any_errors_fatal 27712 1727096477.29592: done checking for any_errors_fatal 27712 1727096477.29593: checking for max_fail_percentage 27712 1727096477.29595: done checking for max_fail_percentage 27712 1727096477.29596: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.29596: done checking to see if all hosts have failed 27712 1727096477.29597: getting the remaining hosts for this loop 27712 1727096477.29599: done getting the remaining hosts for this loop 27712 1727096477.29602: getting the next task for host managed_node2 27712 1727096477.29610: done getting next task for host managed_node2 27712 1727096477.29613: ^ task is: TASK: Set network provider to 'nm' 27712 1727096477.29615: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.29620: getting variables 27712 1727096477.29621: in VariableManager get_vars() 27712 1727096477.29652: Calling all_inventory to load vars for managed_node2 27712 1727096477.29654: Calling groups_inventory to load vars for managed_node2 27712 1727096477.29658: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.29671: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.29674: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.29678: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.30357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.30732: done with get_vars() 27712 1727096477.30743: done getting variables 27712 1727096477.30802: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:13 Monday 23 September 2024 09:01:17 -0400 (0:00:00.040) 0:00:03.001 ****** 27712 1727096477.30831: entering _queue_task() for managed_node2/set_fact 27712 1727096477.31604: worker is 1 (out of 1 available) 27712 1727096477.31615: exiting _queue_task() for managed_node2/set_fact 27712 1727096477.31627: done queuing things up, now waiting for results queue to drain 27712 1727096477.31629: waiting for pending results... 27712 1727096477.31881: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 27712 1727096477.31976: in run() - task 0afff68d-5257-cbc7-8716-000000000007 27712 1727096477.32076: variable 'ansible_search_path' from source: unknown 27712 1727096477.32080: calling self._execute() 27712 1727096477.32123: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.32134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.32147: variable 'omit' from source: magic vars 27712 1727096477.32260: variable 'omit' from source: magic vars 27712 1727096477.32296: variable 'omit' from source: magic vars 27712 1727096477.32357: variable 'omit' from source: magic vars 27712 1727096477.32453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096477.32540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096477.32582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096477.32614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096477.32669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096477.32718: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096477.32779: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.32782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.32987: Set connection var ansible_connection 
to ssh 27712 1727096477.32991: Set connection var ansible_pipelining to False 27712 1727096477.32993: Set connection var ansible_timeout to 10 27712 1727096477.32996: Set connection var ansible_shell_type to sh 27712 1727096477.32998: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096477.33000: Set connection var ansible_shell_executable to /bin/sh 27712 1727096477.33002: variable 'ansible_shell_executable' from source: unknown 27712 1727096477.33004: variable 'ansible_connection' from source: unknown 27712 1727096477.33007: variable 'ansible_module_compression' from source: unknown 27712 1727096477.33008: variable 'ansible_shell_type' from source: unknown 27712 1727096477.33010: variable 'ansible_shell_executable' from source: unknown 27712 1727096477.33013: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.33015: variable 'ansible_pipelining' from source: unknown 27712 1727096477.33020: variable 'ansible_timeout' from source: unknown 27712 1727096477.33022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.33475: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096477.33558: variable 'omit' from source: magic vars 27712 1727096477.33583: starting attempt loop 27712 1727096477.33592: running the handler 27712 1727096477.33609: handler run complete 27712 1727096477.33685: attempt loop complete, returning result 27712 1727096477.33694: _execute() done 27712 1727096477.33708: dumping result to json 27712 1727096477.33723: done dumping result, returning 27712 1727096477.33765: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0afff68d-5257-cbc7-8716-000000000007] 27712 1727096477.33770: sending task result for task 0afff68d-5257-cbc7-8716-000000000007 27712 1727096477.33986: done sending task result for task 0afff68d-5257-cbc7-8716-000000000007 27712 1727096477.33988: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 27712 1727096477.34043: no more pending results, returning what we have 27712 1727096477.34046: results queue empty 27712 1727096477.34046: checking for any_errors_fatal 27712 1727096477.34052: done checking for any_errors_fatal 27712 1727096477.34052: checking for max_fail_percentage 27712 1727096477.34054: done checking for max_fail_percentage 27712 1727096477.34055: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.34055: done checking to see if all hosts have failed 27712 1727096477.34056: getting the remaining hosts for this loop 27712 1727096477.34057: done getting the remaining hosts for this loop 27712 1727096477.34061: getting the next task for host managed_node2 27712 1727096477.34074: done getting next task for host managed_node2 27712 1727096477.34172: ^ task is: TASK: meta (flush_handlers) 27712 1727096477.34174: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.34179: getting variables 27712 1727096477.34180: in VariableManager get_vars() 27712 1727096477.34215: Calling all_inventory to load vars for managed_node2 27712 1727096477.34218: Calling groups_inventory to load vars for managed_node2 27712 1727096477.34224: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.34233: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.34236: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.34239: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.34621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.34824: done with get_vars() 27712 1727096477.34834: done getting variables 27712 1727096477.34933: in VariableManager get_vars() 27712 1727096477.34956: Calling all_inventory to load vars for managed_node2 27712 1727096477.34958: Calling groups_inventory to load vars for managed_node2 27712 1727096477.34961: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.34966: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.34978: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.34982: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.35113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.35315: done with get_vars() 27712 1727096477.35329: done queuing things up, now waiting for results queue to drain 27712 1727096477.35330: results queue empty 27712 1727096477.35331: checking for any_errors_fatal 27712 1727096477.35333: done checking for any_errors_fatal 27712 1727096477.35334: checking for max_fail_percentage 27712 1727096477.35335: done checking for max_fail_percentage 27712 1727096477.35336: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.35336: done checking to see if all hosts have failed 27712 1727096477.35337: getting the remaining hosts for this loop 27712 1727096477.35338: done getting the remaining hosts for this loop 27712 1727096477.35341: getting the next task for host managed_node2 27712 1727096477.35345: done getting next task for host managed_node2 27712 1727096477.35346: ^ task is: TASK: meta (flush_handlers) 27712 1727096477.35347: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.35355: getting variables 27712 1727096477.35356: in VariableManager get_vars() 27712 1727096477.35363: Calling all_inventory to load vars for managed_node2 27712 1727096477.35365: Calling groups_inventory to load vars for managed_node2 27712 1727096477.35371: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.35375: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.35378: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.35381: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.35512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.35722: done with get_vars() 27712 1727096477.35730: done getting variables 27712 1727096477.35787: in VariableManager get_vars() 27712 1727096477.35795: Calling all_inventory to load vars for managed_node2 27712 1727096477.35797: Calling groups_inventory to load vars for managed_node2 27712 1727096477.35800: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.35804: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.35806: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.35809: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.35971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.36138: done with get_vars() 27712 1727096477.36149: done queuing things up, now waiting for results queue to drain 27712 1727096477.36151: results queue empty 27712 1727096477.36152: checking for any_errors_fatal 27712 1727096477.36153: done checking for any_errors_fatal 27712 1727096477.36154: checking for max_fail_percentage 27712 1727096477.36155: done checking for max_fail_percentage 27712 1727096477.36155: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.36156: done checking to see if all hosts have failed 27712 1727096477.36157: getting the remaining hosts for this loop 27712 1727096477.36158: done getting the remaining hosts for this loop 27712 1727096477.36160: getting the next task for host managed_node2 27712 1727096477.36163: done getting next task for host managed_node2 27712 1727096477.36164: ^ task is: None 27712 1727096477.36166: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.36169: done queuing things up, now waiting for results queue to drain 27712 1727096477.36170: results queue empty 27712 1727096477.36171: checking for any_errors_fatal 27712 1727096477.36171: done checking for any_errors_fatal 27712 1727096477.36172: checking for max_fail_percentage 27712 1727096477.36173: done checking for max_fail_percentage 27712 1727096477.36174: checking to see if all hosts have failed and the running result is not ok 27712 1727096477.36174: done checking to see if all hosts have failed 27712 1727096477.36176: getting the next task for host managed_node2 27712 1727096477.36179: done getting next task for host managed_node2 27712 1727096477.36180: ^ task is: None 27712 1727096477.36181: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096477.36228: in VariableManager get_vars() 27712 1727096477.36254: done with get_vars() 27712 1727096477.36260: in VariableManager get_vars() 27712 1727096477.36277: done with get_vars() 27712 1727096477.36282: variable 'omit' from source: magic vars 27712 1727096477.36313: in VariableManager get_vars() 27712 1727096477.36327: done with get_vars() 27712 1727096477.36348: variable 'omit' from source: magic vars PLAY [Test output device of routes] ******************************************** 27712 1727096477.37121: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 27712 1727096477.37219: getting the remaining hosts for this loop 27712 1727096477.37221: done getting the remaining hosts for this loop 27712 1727096477.37224: getting the next task for host managed_node2 27712 1727096477.37226: done getting next task for host managed_node2 27712 1727096477.37233: ^ task is: TASK: Gathering Facts 27712 1727096477.37235: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096477.37237: getting variables 27712 1727096477.37238: in VariableManager get_vars() 27712 1727096477.37293: Calling all_inventory to load vars for managed_node2 27712 1727096477.37296: Calling groups_inventory to load vars for managed_node2 27712 1727096477.37298: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096477.37303: Calling all_plugins_play to load vars for managed_node2 27712 1727096477.37520: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096477.37552: Calling groups_plugins_play to load vars for managed_node2 27712 1727096477.37728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096477.37897: done with get_vars() 27712 1727096477.37905: done getting variables 27712 1727096477.38117: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3 Monday 23 September 2024 09:01:17 -0400 (0:00:00.073) 0:00:03.074 ****** 27712 1727096477.38141: entering _queue_task() for managed_node2/gather_facts 27712 1727096477.38436: worker is 1 (out of 1 available) 27712 1727096477.38447: exiting _queue_task() for managed_node2/gather_facts 27712 1727096477.38460: done queuing things up, now waiting for results queue to drain 27712 1727096477.38461: waiting for pending results... 
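The ok result recorded earlier for "Set network provider to 'nm'" (ansible_facts: network_provider: nm, task path tests_route_device_nm.yml:13) corresponds to a one-line set_fact; a sketch matching that result:

- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm

This is presumably how the nm-specific wrapper playbook selects the NetworkManager provider before the shared "Test output device of routes" play (playbooks/tests_route_device.yml) begins gathering facts, which is what the entries below show.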
27712 1727096477.38702: running TaskExecutor() for managed_node2/TASK: Gathering Facts 27712 1727096477.38804: in run() - task 0afff68d-5257-cbc7-8716-00000000011b 27712 1727096477.38822: variable 'ansible_search_path' from source: unknown 27712 1727096477.38860: calling self._execute() 27712 1727096477.39073: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.39076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.39079: variable 'omit' from source: magic vars 27712 1727096477.39328: variable 'ansible_distribution_major_version' from source: facts 27712 1727096477.39345: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096477.39355: variable 'omit' from source: magic vars 27712 1727096477.39384: variable 'omit' from source: magic vars 27712 1727096477.39427: variable 'omit' from source: magic vars 27712 1727096477.39471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096477.39515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096477.39543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096477.39566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096477.39585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096477.39634: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096477.39643: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.39650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.39821: Set connection var ansible_connection to ssh 27712 1727096477.39825: Set connection var ansible_pipelining to False 27712 1727096477.39827: Set connection var ansible_timeout to 10 27712 1727096477.39829: Set connection var ansible_shell_type to sh 27712 1727096477.39832: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096477.39834: Set connection var ansible_shell_executable to /bin/sh 27712 1727096477.39938: variable 'ansible_shell_executable' from source: unknown 27712 1727096477.39941: variable 'ansible_connection' from source: unknown 27712 1727096477.39944: variable 'ansible_module_compression' from source: unknown 27712 1727096477.39946: variable 'ansible_shell_type' from source: unknown 27712 1727096477.39948: variable 'ansible_shell_executable' from source: unknown 27712 1727096477.39950: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096477.39952: variable 'ansible_pipelining' from source: unknown 27712 1727096477.39953: variable 'ansible_timeout' from source: unknown 27712 1727096477.40038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096477.40140: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096477.40161: variable 'omit' from source: magic vars 27712 1727096477.40174: starting attempt loop 27712 1727096477.40181: running the 
handler 27712 1727096477.40201: variable 'ansible_facts' from source: unknown 27712 1727096477.40262: _low_level_execute_command(): starting 27712 1727096477.40265: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096477.40945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096477.40959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096477.41034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096477.41099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096477.41157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096477.41188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096477.43698: stdout chunk (state=3): >>>/root <<< 27712 1727096477.43733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096477.43736: stdout chunk (state=3): >>><<< 27712 1727096477.43738: stderr chunk (state=3): >>><<< 27712 1727096477.43855: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096477.43859: _low_level_execute_command(): starting 27712 1727096477.43862: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727 `" && echo 
ansible-tmp-1727096477.4376147-27902-111846852782727="` echo /root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727 `" ) && sleep 0' 27712 1727096477.45116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096477.45120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096477.45122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096477.45124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096477.45144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096477.45151: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096477.45288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096477.45301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096477.45423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096477.45451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096477.45520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096477.48244: stdout chunk (state=3): >>>ansible-tmp-1727096477.4376147-27902-111846852782727=/root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727 <<< 27712 1727096477.48435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096477.48439: stdout chunk (state=3): >>><<< 27712 1727096477.48875: stderr chunk (state=3): >>><<< 27712 1727096477.48879: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096477.4376147-27902-111846852782727=/root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096477.48882: variable 'ansible_module_compression' from source: unknown 27712 1727096477.48884: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 27712 1727096477.48885: variable 'ansible_facts' from source: unknown 27712 1727096477.49385: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/AnsiballZ_setup.py 27712 1727096477.49869: Sending initial data 27712 1727096477.49873: Sent initial data (154 bytes) 27712 1727096477.50955: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096477.51095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096477.51180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096477.53464: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096477.53553: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096477.53629: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpozxzzoup /root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/AnsiballZ_setup.py <<< 27712 1727096477.53651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/AnsiballZ_setup.py" <<< 27712 1727096477.53681: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpozxzzoup" to remote "/root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/AnsiballZ_setup.py" <<< 27712 1727096477.55230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096477.55246: stdout chunk (state=3): >>><<< 27712 1727096477.55271: stderr chunk (state=3): >>><<< 27712 1727096477.55297: done transferring module to remote 27712 1727096477.55311: _low_level_execute_command(): starting 27712 1727096477.55319: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/ /root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/AnsiballZ_setup.py && sleep 0' 27712 1727096477.56008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096477.56112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096477.56148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096477.56171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096477.56257: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096477.56290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096477.56305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096477.56383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096477.56579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096477.59072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096477.59136: stderr chunk (state=3): >>><<< 27712 1727096477.59156: stdout chunk (state=3): >>><<< 27712 1727096477.59184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096477.59269: _low_level_execute_command(): starting 27712 1727096477.59275: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/AnsiballZ_setup.py && sleep 0' 27712 1727096477.59928: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096477.59955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096477.60017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096477.60040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096477.60088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096477.60092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096477.60160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096477.60231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096477.60304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096477.60318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096477.60582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096478.39825: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "01", "second": "18", "epoch": "1727096478", "epoch_int": "1727096478", "date": "2024-09-23", "time": "09:01:18", "iso8601_micro": "2024-09-23T13:01:18.016201Z", "iso8601": "2024-09-23T13:01:18Z", "iso8601_basic": "20240923T090118016201", "iso8601_basic_short": "20240923T090118", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": 
"6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.54736328125, "5m": 0.4912109375, "15m": 0.29345703125}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": 
"off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::8ff:ceff:fe61:4d8f"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 620, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795008512, "block_size": 4096, "block_total": 65519099, "block_available": 63914797, "block_used": 1604302, "inode_total": 131070960, "inode_available": 131029098, "inode_used": 41862, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 27712 1727096478.42625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096478.42639: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096478.42772: stderr chunk (state=3): >>><<< 27712 1727096478.42784: stdout chunk (state=3): >>><<< 27712 1727096478.42885: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "09", "minute": "01", "second": "18", "epoch": "1727096478", "epoch_int": "1727096478", "date": "2024-09-23", "time": "09:01:18", "iso8601_micro": "2024-09-23T13:01:18.016201Z", "iso8601": "2024-09-23T13:01:18Z", "iso8601_basic": "20240923T090118016201", "iso8601_basic_short": "20240923T090118", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-126", "ansible_nodename": "ip-10-31-15-126.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec28dde2945b45c603c07d1816f189ea", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDKegcOhSBBilPJbevAHD4q8M2Mopcwhk5CGV5r5zX7SVKZTcjkW8xnfPyLrun+WW0cQRMXVgP0jFIte4IhEWC+4vcA7Ubhdz4CArTiu0d0CmCP5DrweOGMeiXJAruzQe7p15W7DfmYUttAvwJOnVoGGXIHJ+LeCSjoC8hCBzCBkinO6LdWCxyZJ0Ktd3vrG8rtKXNn6Mcbb/KBQZkIb3Z4FG4DC++e4jGcQGcRFEpSHNwHMfXNsWBHyWTHObEaN/wtzMsrNKvoPkOfnZrX/JzLgfLwZg+6AyfpdYZYO0KUclhofrZ+VMN6lRIJ08BPTU8Ytp/GGVdDbT+CR+/s6ZhfarCNxjaUOeGYKneO18ggxdb122VHaeH6ZtL1MmDlQP+TJDjEo+geHTJ7jspENzBcPqZGqIgTNUWUz6BaYsOfngMlT23D8WFZ0ONY/aME8ehI/7H8ct53v0qli3JiaeASss2Ta0t0TjeAsVmmftFfun4WxCiDEYwZ9qS4slvZfIk=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9J0Wn206/1q3dk5MbgvB+OOvYvoXlD999cW2a247C6inSEimXU7z4+MjUOd20ewjDwiGvOA1TDPvm8yJUuohE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKTXqMHLRVsYJX1CO4X6/wlD0Am2X0KaDd9ZLpNZJmkW", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.54736328125, "5m": 0.4912109375, "15m": 0.29345703125}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": 
true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 34268 10.31.15.126 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 34268 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ceff:fe61:4d8f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.126", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ce:61:4d:8f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.126"], "ansible_all_ipv6_addresses": ["fe80::8ff:ceff:fe61:4d8f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.126", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ceff:fe61:4d8f"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_uuid": "ec28dde2-945b-45c6-03c0-7d1816f189ea", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 620, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795008512, "block_size": 4096, "block_total": 65519099, "block_available": 63914797, "block_used": 1604302, "inode_total": 131070960, "inode_available": 131029098, "inode_used": 41862, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, 
"invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096478.43745: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096478.43748: _low_level_execute_command(): starting 27712 1727096478.43750: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096477.4376147-27902-111846852782727/ > /dev/null 2>&1 && sleep 0' 27712 1727096478.44999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096478.45091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096478.45127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 
1727096478.45192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096478.45277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 27712 1727096478.48475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096478.48479: stdout chunk (state=3): >>><<< 27712 1727096478.48481: stderr chunk (state=3): >>><<< 27712 1727096478.48483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 27712 1727096478.48485: handler run complete 27712 1727096478.48487: variable 'ansible_facts' from source: unknown 27712 1727096478.48648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.49405: variable 'ansible_facts' from source: unknown 27712 1727096478.49499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.49737: attempt loop complete, returning result 27712 1727096478.49809: _execute() done 27712 1727096478.49860: dumping result to json 27712 1727096478.50174: done dumping result, returning 27712 1727096478.50178: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0afff68d-5257-cbc7-8716-00000000011b] 27712 1727096478.50181: sending task result for task 0afff68d-5257-cbc7-8716-00000000011b 27712 1727096478.50726: done sending task result for task 0afff68d-5257-cbc7-8716-00000000011b 27712 1727096478.50729: WORKER PROCESS EXITING ok: [managed_node2] 27712 1727096478.51485: no more pending results, returning what we have 27712 1727096478.51488: results queue empty 27712 1727096478.51489: checking for any_errors_fatal 27712 1727096478.51490: done checking for any_errors_fatal 27712 1727096478.51491: checking for max_fail_percentage 27712 1727096478.51492: done checking for max_fail_percentage 27712 1727096478.51493: checking to see if all hosts have failed and the running result is not ok 27712 1727096478.51494: done checking to see if all hosts have failed 27712 1727096478.51494: getting the remaining hosts for this loop 27712 1727096478.51496: done getting the remaining hosts for this loop 27712 1727096478.51499: getting the next task for host managed_node2 27712 1727096478.51504: done getting next task for host managed_node2 27712 1727096478.51506: ^ task is: TASK: meta 
(flush_handlers) 27712 1727096478.51508: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096478.51512: getting variables 27712 1727096478.51513: in VariableManager get_vars() 27712 1727096478.51543: Calling all_inventory to load vars for managed_node2 27712 1727096478.51545: Calling groups_inventory to load vars for managed_node2 27712 1727096478.51548: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096478.51559: Calling all_plugins_play to load vars for managed_node2 27712 1727096478.51561: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096478.51564: Calling groups_plugins_play to load vars for managed_node2 27712 1727096478.52039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.52989: done with get_vars() 27712 1727096478.53002: done getting variables 27712 1727096478.53278: in VariableManager get_vars() 27712 1727096478.53295: Calling all_inventory to load vars for managed_node2 27712 1727096478.53297: Calling groups_inventory to load vars for managed_node2 27712 1727096478.53299: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096478.53304: Calling all_plugins_play to load vars for managed_node2 27712 1727096478.53306: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096478.53309: Calling groups_plugins_play to load vars for managed_node2 27712 1727096478.53422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.54218: done with get_vars() 27712 1727096478.54233: done queuing things up, now waiting for results queue to drain 27712 1727096478.54235: results queue empty 27712 1727096478.54235: checking for any_errors_fatal 27712 1727096478.54239: done checking for any_errors_fatal 27712 1727096478.54240: checking for max_fail_percentage 27712 1727096478.54241: done checking for max_fail_percentage 27712 1727096478.54242: checking to see if all hosts have failed and the running result is not ok 27712 1727096478.54248: done checking to see if all hosts have failed 27712 1727096478.54249: getting the remaining hosts for this loop 27712 1727096478.54250: done getting the remaining hosts for this loop 27712 1727096478.54253: getting the next task for host managed_node2 27712 1727096478.54257: done getting next task for host managed_node2 27712 1727096478.54259: ^ task is: TASK: Set type and interface0 27712 1727096478.54261: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096478.54263: getting variables 27712 1727096478.54264: in VariableManager get_vars() 27712 1727096478.54441: Calling all_inventory to load vars for managed_node2 27712 1727096478.54444: Calling groups_inventory to load vars for managed_node2 27712 1727096478.54446: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096478.54451: Calling all_plugins_play to load vars for managed_node2 27712 1727096478.54454: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096478.54457: Calling groups_plugins_play to load vars for managed_node2 27712 1727096478.54816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.55294: done with get_vars() 27712 1727096478.55419: done getting variables 27712 1727096478.55480: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set type and interface0] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:11 Monday 23 September 2024 09:01:18 -0400 (0:00:01.173) 0:00:04.248 ****** 27712 1727096478.55508: entering _queue_task() for managed_node2/set_fact 27712 1727096478.56472: worker is 1 (out of 1 available) 27712 1727096478.56484: exiting _queue_task() for managed_node2/set_fact 27712 1727096478.56496: done queuing things up, now waiting for results queue to drain 27712 1727096478.56497: waiting for pending results... 27712 1727096478.57172: running TaskExecutor() for managed_node2/TASK: Set type and interface0 27712 1727096478.57474: in run() - task 0afff68d-5257-cbc7-8716-00000000000b 27712 1727096478.57479: variable 'ansible_search_path' from source: unknown 27712 1727096478.57482: calling self._execute() 27712 1727096478.57874: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096478.57877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096478.57880: variable 'omit' from source: magic vars 27712 1727096478.58533: variable 'ansible_distribution_major_version' from source: facts 27712 1727096478.58552: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096478.58564: variable 'omit' from source: magic vars 27712 1727096478.58600: variable 'omit' from source: magic vars 27712 1727096478.58641: variable 'type' from source: play vars 27712 1727096478.58953: variable 'type' from source: play vars 27712 1727096478.59010: variable 'interface0' from source: play vars 27712 1727096478.59191: variable 'interface0' from source: play vars 27712 1727096478.59211: variable 'omit' from source: magic vars 27712 1727096478.59252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096478.59674: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096478.59678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096478.59681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096478.59683: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096478.59685: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096478.59688: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096478.59689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096478.59722: Set connection var ansible_connection to ssh 27712 1727096478.60172: Set connection var ansible_pipelining to False 27712 1727096478.60175: Set connection var ansible_timeout to 10 27712 1727096478.60178: Set connection var ansible_shell_type to sh 27712 1727096478.60180: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096478.60182: Set connection var ansible_shell_executable to /bin/sh 27712 1727096478.60184: variable 'ansible_shell_executable' from source: unknown 27712 1727096478.60186: variable 'ansible_connection' from source: unknown 27712 1727096478.60188: variable 'ansible_module_compression' from source: unknown 27712 1727096478.60190: variable 'ansible_shell_type' from source: unknown 27712 1727096478.60192: variable 'ansible_shell_executable' from source: unknown 27712 1727096478.60193: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096478.60195: variable 'ansible_pipelining' from source: unknown 27712 1727096478.60198: variable 'ansible_timeout' from source: unknown 27712 1727096478.60199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096478.60206: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096478.60221: variable 'omit' from source: magic vars 27712 1727096478.60231: starting attempt loop 27712 1727096478.60237: running the handler 27712 1727096478.60254: handler run complete 27712 1727096478.60355: attempt loop complete, returning result 27712 1727096478.60363: _execute() done 27712 1727096478.60374: dumping result to json 27712 1727096478.60383: done dumping result, returning 27712 1727096478.60397: done running TaskExecutor() for managed_node2/TASK: Set type and interface0 [0afff68d-5257-cbc7-8716-00000000000b] 27712 1727096478.60426: sending task result for task 0afff68d-5257-cbc7-8716-00000000000b ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 27712 1727096478.60588: no more pending results, returning what we have 27712 1727096478.60592: results queue empty 27712 1727096478.60593: checking for any_errors_fatal 27712 1727096478.60595: done checking for any_errors_fatal 27712 1727096478.60596: checking for max_fail_percentage 27712 1727096478.60597: done checking for max_fail_percentage 27712 1727096478.60601: checking to see if all hosts have failed and the running result is not ok 27712 1727096478.60603: done checking to see if all hosts have failed 27712 1727096478.60603: getting the remaining hosts for this loop 27712 1727096478.60605: done getting the remaining hosts for this loop 27712 1727096478.60611: getting the next task for host managed_node2 27712 1727096478.60617: done getting next task for host managed_node2 27712 1727096478.60620: ^ task is: TASK: Show interfaces 27712 1727096478.60622: ^ state is: HOST STATE: block=2, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096478.60626: getting variables 27712 1727096478.60628: in VariableManager get_vars() 27712 1727096478.60674: Calling all_inventory to load vars for managed_node2 27712 1727096478.60677: Calling groups_inventory to load vars for managed_node2 27712 1727096478.60680: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096478.60691: Calling all_plugins_play to load vars for managed_node2 27712 1727096478.60694: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096478.60697: Calling groups_plugins_play to load vars for managed_node2 27712 1727096478.61696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.62132: done with get_vars() 27712 1727096478.62145: done getting variables 27712 1727096478.62475: done sending task result for task 0afff68d-5257-cbc7-8716-00000000000b 27712 1727096478.62479: WORKER PROCESS EXITING TASK [Show interfaces] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:15 Monday 23 September 2024 09:01:18 -0400 (0:00:00.070) 0:00:04.318 ****** 27712 1727096478.62549: entering _queue_task() for managed_node2/include_tasks 27712 1727096478.63805: worker is 1 (out of 1 available) 27712 1727096478.63817: exiting _queue_task() for managed_node2/include_tasks 27712 1727096478.63829: done queuing things up, now waiting for results queue to drain 27712 1727096478.63830: waiting for pending results... 
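The ok result for TASK [Set type and interface0] a few entries above sets the facts used by the rest of this run. Judging only from the logged ansible_facts, the underlying task in tests_route_device.yml is presumably a plain set_fact along these lines (a reconstruction from the log, not the playbook's verbatim source):

    - name: Set type and interface0
      set_fact:
        interface: ethtest0
        type: veth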
27712 1727096478.63975: running TaskExecutor() for managed_node2/TASK: Show interfaces 27712 1727096478.64128: in run() - task 0afff68d-5257-cbc7-8716-00000000000c 27712 1727096478.64307: variable 'ansible_search_path' from source: unknown 27712 1727096478.64507: calling self._execute() 27712 1727096478.64726: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096478.64856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096478.64863: variable 'omit' from source: magic vars 27712 1727096478.66058: variable 'ansible_distribution_major_version' from source: facts 27712 1727096478.66136: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096478.66144: _execute() done 27712 1727096478.66219: dumping result to json 27712 1727096478.66226: done dumping result, returning 27712 1727096478.66237: done running TaskExecutor() for managed_node2/TASK: Show interfaces [0afff68d-5257-cbc7-8716-00000000000c] 27712 1727096478.66272: sending task result for task 0afff68d-5257-cbc7-8716-00000000000c 27712 1727096478.66602: done sending task result for task 0afff68d-5257-cbc7-8716-00000000000c 27712 1727096478.66611: WORKER PROCESS EXITING 27712 1727096478.66641: no more pending results, returning what we have 27712 1727096478.66645: in VariableManager get_vars() 27712 1727096478.66695: Calling all_inventory to load vars for managed_node2 27712 1727096478.66698: Calling groups_inventory to load vars for managed_node2 27712 1727096478.66701: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096478.66718: Calling all_plugins_play to load vars for managed_node2 27712 1727096478.66724: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096478.66728: Calling groups_plugins_play to load vars for managed_node2 27712 1727096478.67540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.68277: done with get_vars() 27712 1727096478.68316: variable 'ansible_search_path' from source: unknown 27712 1727096478.68331: we have included files to process 27712 1727096478.68332: generating all_blocks data 27712 1727096478.68333: done generating all_blocks data 27712 1727096478.68334: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096478.68335: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096478.68337: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096478.68530: in VariableManager get_vars() 27712 1727096478.68550: done with get_vars() 27712 1727096478.68723: done processing included file 27712 1727096478.68725: iterating over new_blocks loaded from include file 27712 1727096478.68727: in VariableManager get_vars() 27712 1727096478.68747: done with get_vars() 27712 1727096478.68749: filtering new block on tags 27712 1727096478.68766: done filtering new block on tags 27712 1727096478.68770: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 27712 1727096478.68775: extending task lists for all hosts with included blocks 27712 1727096478.68920: done extending task lists 
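The 'Show interfaces' step logged above is an include_tasks whose conditional on ansible_distribution_major_version evaluated to True before the include file was processed. A minimal sketch of what the task at tests_route_device.yml:15 presumably looks like (assumed; the distribution check may instead be applied at play level rather than on this task):

    - name: Show interfaces
      include_tasks: tasks/show_interfaces.yml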
27712 1727096478.68921: done processing included files 27712 1727096478.68922: results queue empty 27712 1727096478.68923: checking for any_errors_fatal 27712 1727096478.68925: done checking for any_errors_fatal 27712 1727096478.68926: checking for max_fail_percentage 27712 1727096478.68927: done checking for max_fail_percentage 27712 1727096478.68928: checking to see if all hosts have failed and the running result is not ok 27712 1727096478.68928: done checking to see if all hosts have failed 27712 1727096478.68929: getting the remaining hosts for this loop 27712 1727096478.68930: done getting the remaining hosts for this loop 27712 1727096478.68932: getting the next task for host managed_node2 27712 1727096478.68936: done getting next task for host managed_node2 27712 1727096478.68938: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27712 1727096478.68940: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096478.68942: getting variables 27712 1727096478.68943: in VariableManager get_vars() 27712 1727096478.68956: Calling all_inventory to load vars for managed_node2 27712 1727096478.68958: Calling groups_inventory to load vars for managed_node2 27712 1727096478.68960: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096478.68965: Calling all_plugins_play to load vars for managed_node2 27712 1727096478.68970: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096478.68973: Calling groups_plugins_play to load vars for managed_node2 27712 1727096478.69121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.69331: done with get_vars() 27712 1727096478.69341: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:01:18 -0400 (0:00:00.068) 0:00:04.387 ****** 27712 1727096478.69412: entering _queue_task() for managed_node2/include_tasks 27712 1727096478.69728: worker is 1 (out of 1 available) 27712 1727096478.69850: exiting _queue_task() for managed_node2/include_tasks 27712 1727096478.69864: done queuing things up, now waiting for results queue to drain 27712 1727096478.69866: waiting for pending results... 
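The TASK header just printed comes from show_interfaces.yml:3, and the entries that follow show it pulling in get_current_interfaces.yml. As a sketch, assuming the task name mirrors the header, it is presumably just:

    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml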
27712 1727096478.70530: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 27712 1727096478.70545: in run() - task 0afff68d-5257-cbc7-8716-000000000135 27712 1727096478.70564: variable 'ansible_search_path' from source: unknown 27712 1727096478.70582: variable 'ansible_search_path' from source: unknown 27712 1727096478.70669: calling self._execute() 27712 1727096478.70829: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096478.70940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096478.70959: variable 'omit' from source: magic vars 27712 1727096478.71474: variable 'ansible_distribution_major_version' from source: facts 27712 1727096478.71497: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096478.71548: _execute() done 27712 1727096478.71577: dumping result to json 27712 1727096478.71621: done dumping result, returning 27712 1727096478.71645: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-cbc7-8716-000000000135] 27712 1727096478.71679: sending task result for task 0afff68d-5257-cbc7-8716-000000000135 27712 1727096478.71815: done sending task result for task 0afff68d-5257-cbc7-8716-000000000135 27712 1727096478.71857: no more pending results, returning what we have 27712 1727096478.71862: in VariableManager get_vars() 27712 1727096478.71921: Calling all_inventory to load vars for managed_node2 27712 1727096478.71924: Calling groups_inventory to load vars for managed_node2 27712 1727096478.71927: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096478.71943: Calling all_plugins_play to load vars for managed_node2 27712 1727096478.71949: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096478.71953: Calling groups_plugins_play to load vars for managed_node2 27712 1727096478.72738: WORKER PROCESS EXITING 27712 1727096478.72844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.73211: done with get_vars() 27712 1727096478.73220: variable 'ansible_search_path' from source: unknown 27712 1727096478.73221: variable 'ansible_search_path' from source: unknown 27712 1727096478.73261: we have included files to process 27712 1727096478.73262: generating all_blocks data 27712 1727096478.73264: done generating all_blocks data 27712 1727096478.73265: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096478.73266: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096478.73293: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096478.73920: done processing included file 27712 1727096478.73923: iterating over new_blocks loaded from include file 27712 1727096478.73924: in VariableManager get_vars() 27712 1727096478.73943: done with get_vars() 27712 1727096478.73945: filtering new block on tags 27712 1727096478.73962: done filtering new block on tags 27712 1727096478.73965: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node2 27712 1727096478.73971: extending task lists for all hosts with included blocks 27712 1727096478.74127: done extending task lists 27712 1727096478.74128: done processing included files 27712 1727096478.74129: results queue empty 27712 1727096478.74130: checking for any_errors_fatal 27712 1727096478.74133: done checking for any_errors_fatal 27712 1727096478.74134: checking for max_fail_percentage 27712 1727096478.74135: done checking for max_fail_percentage 27712 1727096478.74136: checking to see if all hosts have failed and the running result is not ok 27712 1727096478.74136: done checking to see if all hosts have failed 27712 1727096478.74137: getting the remaining hosts for this loop 27712 1727096478.74138: done getting the remaining hosts for this loop 27712 1727096478.74141: getting the next task for host managed_node2 27712 1727096478.74145: done getting next task for host managed_node2 27712 1727096478.74148: ^ task is: TASK: Gather current interface info 27712 1727096478.74151: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096478.74153: getting variables 27712 1727096478.74154: in VariableManager get_vars() 27712 1727096478.74172: Calling all_inventory to load vars for managed_node2 27712 1727096478.74175: Calling groups_inventory to load vars for managed_node2 27712 1727096478.74177: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096478.74186: Calling all_plugins_play to load vars for managed_node2 27712 1727096478.74189: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096478.74192: Calling groups_plugins_play to load vars for managed_node2 27712 1727096478.74397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096478.74578: done with get_vars() 27712 1727096478.74588: done getting variables 27712 1727096478.74625: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:01:18 -0400 (0:00:00.052) 0:00:04.439 ****** 27712 1727096478.74652: entering _queue_task() for managed_node2/command 27712 1727096478.74966: worker is 1 (out of 1 available) 27712 1727096478.74980: exiting _queue_task() for managed_node2/command 27712 1727096478.75115: done queuing things up, now waiting for results queue to drain 27712 1727096478.75116: waiting for pending results... 
27712 1727096478.75284: running TaskExecutor() for managed_node2/TASK: Gather current interface info 27712 1727096478.75516: in run() - task 0afff68d-5257-cbc7-8716-00000000014e 27712 1727096478.75591: variable 'ansible_search_path' from source: unknown 27712 1727096478.75595: variable 'ansible_search_path' from source: unknown 27712 1727096478.75630: calling self._execute() 27712 1727096478.75801: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096478.75873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096478.75876: variable 'omit' from source: magic vars 27712 1727096478.76262: variable 'ansible_distribution_major_version' from source: facts 27712 1727096478.76286: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096478.76303: variable 'omit' from source: magic vars 27712 1727096478.76347: variable 'omit' from source: magic vars 27712 1727096478.76407: variable 'omit' from source: magic vars 27712 1727096478.76433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096478.76474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096478.76573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096478.76576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096478.76578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096478.76599: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096478.76609: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096478.76732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096478.76735: Set connection var ansible_connection to ssh 27712 1727096478.76739: Set connection var ansible_pipelining to False 27712 1727096478.76742: Set connection var ansible_timeout to 10 27712 1727096478.76749: Set connection var ansible_shell_type to sh 27712 1727096478.76761: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096478.76771: Set connection var ansible_shell_executable to /bin/sh 27712 1727096478.76796: variable 'ansible_shell_executable' from source: unknown 27712 1727096478.76803: variable 'ansible_connection' from source: unknown 27712 1727096478.76809: variable 'ansible_module_compression' from source: unknown 27712 1727096478.76814: variable 'ansible_shell_type' from source: unknown 27712 1727096478.76820: variable 'ansible_shell_executable' from source: unknown 27712 1727096478.76825: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096478.76830: variable 'ansible_pipelining' from source: unknown 27712 1727096478.76840: variable 'ansible_timeout' from source: unknown 27712 1727096478.76849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096478.77000: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096478.77084: variable 'omit' from source: magic vars 27712 
1727096478.77191: starting attempt loop 27712 1727096478.77194: running the handler 27712 1727096478.77526: _low_level_execute_command(): starting 27712 1727096478.77529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096478.78789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096478.78825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096478.78841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096478.78896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096478.78977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096478.78997: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096478.79101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096478.79174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096478.79277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096478.81137: stdout chunk (state=3): >>>/root <<< 27712 1727096478.81141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096478.81336: stderr chunk (state=3): >>><<< 27712 1727096478.81339: stdout chunk (state=3): >>><<< 27712 1727096478.81729: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096478.81734: _low_level_execute_command(): starting 27712 1727096478.81737: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir 
-p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436 `" && echo ansible-tmp-1727096478.8159301-27973-39940042122436="` echo /root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436 `" ) && sleep 0' 27712 1727096478.83886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096478.84180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096478.84184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096478.84186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096478.84199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096478.86432: stdout chunk (state=3): >>>ansible-tmp-1727096478.8159301-27973-39940042122436=/root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436 <<< 27712 1727096478.86534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096478.86707: stderr chunk (state=3): >>><<< 27712 1727096478.86714: stdout chunk (state=3): >>><<< 27712 1727096478.86736: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096478.8159301-27973-39940042122436=/root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096478.86772: variable 'ansible_module_compression' from source: unknown 27712 1727096478.86916: 
ANSIBALLZ: Using generic lock for ansible.legacy.command 27712 1727096478.86920: ANSIBALLZ: Acquiring lock 27712 1727096478.86923: ANSIBALLZ: Lock acquired: 140297911472480 27712 1727096478.86925: ANSIBALLZ: Creating module 27712 1727096479.09039: ANSIBALLZ: Writing module into payload 27712 1727096479.09138: ANSIBALLZ: Writing module 27712 1727096479.09174: ANSIBALLZ: Renaming module 27712 1727096479.09177: ANSIBALLZ: Done creating module 27712 1727096479.09198: variable 'ansible_facts' from source: unknown 27712 1727096479.09272: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/AnsiballZ_command.py 27712 1727096479.09706: Sending initial data 27712 1727096479.09708: Sent initial data (155 bytes) 27712 1727096479.10961: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.11084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.11104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.11181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.12822: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27712 1727096479.12836: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 27712 1727096479.12846: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 27712 1727096479.12854: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 27712 1727096479.12870: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096479.12915: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096479.13066: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpjvcx3c7r /root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/AnsiballZ_command.py <<< 27712 1727096479.13073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/AnsiballZ_command.py" <<< 27712 1727096479.13096: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpjvcx3c7r" to remote "/root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/AnsiballZ_command.py" <<< 27712 1727096479.13883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096479.13887: stdout chunk (state=3): >>><<< 27712 1727096479.13895: stderr chunk (state=3): >>><<< 27712 1727096479.13956: done transferring module to remote 27712 1727096479.13969: _low_level_execute_command(): starting 27712 1727096479.13982: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/ /root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/AnsiballZ_command.py && sleep 0' 27712 1727096479.14674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096479.14677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096479.14680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096479.14682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096479.14691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096479.14699: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096479.14815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.14818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096479.14821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096479.14828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.14842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.14903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.16786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096479.16840: stderr chunk (state=3): >>><<< 27712 1727096479.16843: stdout chunk (state=3): >>><<< 27712 1727096479.16861: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096479.16865: _low_level_execute_command(): starting 27712 1727096479.16869: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/AnsiballZ_command.py && sleep 0' 27712 1727096479.17538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096479.17581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096479.17584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096479.17587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096479.17589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096479.17597: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096479.17691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.17694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096479.17696: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096479.17698: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096479.17700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096479.17701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096479.17703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096479.17752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096479.17764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.17789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.17872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.33680: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", 
"stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:01:19.332420", "end": "2024-09-23 09:01:19.335788", "delta": "0:00:00.003368", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096479.35390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096479.35398: stderr chunk (state=3): >>><<< 27712 1727096479.35402: stdout chunk (state=3): >>><<< 27712 1727096479.35405: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:01:19.332420", "end": "2024-09-23 09:01:19.335788", "delta": "0:00:00.003368", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096479.35442: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096479.35446: _low_level_execute_command(): starting 27712 1727096479.35449: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096478.8159301-27973-39940042122436/ > /dev/null 2>&1 && sleep 0' 27712 1727096479.36042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096479.36046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096479.36053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096479.36056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.36115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.36125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.36157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.38476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096479.38480: stdout chunk (state=3): >>><<< 27712 1727096479.38483: stderr chunk (state=3): >>><<< 27712 1727096479.38485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096479.38487: handler run complete 27712 1727096479.38489: Evaluated conditional (False): False 27712 1727096479.38490: attempt loop complete, returning result 27712 1727096479.38492: _execute() done 27712 1727096479.38494: dumping result to json 27712 1727096479.38495: done dumping result, returning 27712 1727096479.38497: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-cbc7-8716-00000000014e] 27712 1727096479.38499: sending task result for task 0afff68d-5257-cbc7-8716-00000000014e ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003368", "end": "2024-09-23 09:01:19.335788", "rc": 0, "start": "2024-09-23 09:01:19.332420" } STDOUT: bonding_masters eth0 lo 27712 1727096479.38652: no more pending results, returning what we have 27712 1727096479.38656: results queue empty 27712 1727096479.38657: checking for any_errors_fatal 27712 1727096479.38658: done checking for any_errors_fatal 27712 1727096479.38659: checking for max_fail_percentage 27712 1727096479.38660: done checking for max_fail_percentage 27712 1727096479.38661: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.38662: done checking to see if all hosts have failed 27712 1727096479.38662: getting the remaining hosts for this loop 27712 1727096479.38664: done getting the remaining hosts for this loop 27712 1727096479.38670: getting the next task for host managed_node2 27712 1727096479.38677: done getting next task for host managed_node2 27712 1727096479.38680: ^ task is: TASK: Set current_interfaces 27712 1727096479.38684: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096479.38688: getting variables 27712 1727096479.38689: in VariableManager get_vars() 27712 1727096479.38729: Calling all_inventory to load vars for managed_node2 27712 1727096479.38732: Calling groups_inventory to load vars for managed_node2 27712 1727096479.38734: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.38748: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.38752: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.38758: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.39332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.40002: done sending task result for task 0afff68d-5257-cbc7-8716-00000000014e 27712 1727096479.40006: WORKER PROCESS EXITING 27712 1727096479.40144: done with get_vars() 27712 1727096479.40157: done getting variables 27712 1727096479.40418: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:01:19 -0400 (0:00:00.657) 0:00:05.097 ****** 27712 1727096479.40448: entering _queue_task() for managed_node2/set_fact 27712 1727096479.41170: worker is 1 (out of 1 available) 27712 1727096479.41178: exiting _queue_task() for managed_node2/set_fact 27712 1727096479.41188: done queuing things up, now waiting for results queue to drain 27712 1727096479.41189: waiting for pending results... 
27712 1727096479.41217: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 27712 1727096479.41330: in run() - task 0afff68d-5257-cbc7-8716-00000000014f 27712 1727096479.41355: variable 'ansible_search_path' from source: unknown 27712 1727096479.41362: variable 'ansible_search_path' from source: unknown 27712 1727096479.41401: calling self._execute() 27712 1727096479.41488: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.41500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.41514: variable 'omit' from source: magic vars 27712 1727096479.41874: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.41891: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.41901: variable 'omit' from source: magic vars 27712 1727096479.41944: variable 'omit' from source: magic vars 27712 1727096479.42051: variable '_current_interfaces' from source: set_fact 27712 1727096479.42121: variable 'omit' from source: magic vars 27712 1727096479.42164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096479.42207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096479.42283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096479.42292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.42294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.42304: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096479.42312: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.42318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.42423: Set connection var ansible_connection to ssh 27712 1727096479.42436: Set connection var ansible_pipelining to False 27712 1727096479.42445: Set connection var ansible_timeout to 10 27712 1727096479.42451: Set connection var ansible_shell_type to sh 27712 1727096479.42461: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096479.42471: Set connection var ansible_shell_executable to /bin/sh 27712 1727096479.42500: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.42608: variable 'ansible_connection' from source: unknown 27712 1727096479.42611: variable 'ansible_module_compression' from source: unknown 27712 1727096479.42613: variable 'ansible_shell_type' from source: unknown 27712 1727096479.42615: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.42617: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.42619: variable 'ansible_pipelining' from source: unknown 27712 1727096479.42621: variable 'ansible_timeout' from source: unknown 27712 1727096479.42623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.42680: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096479.42696: variable 'omit' from source: magic vars 27712 1727096479.42705: starting attempt loop 27712 1727096479.42712: running the handler 27712 1727096479.42729: handler run complete 27712 1727096479.42742: attempt loop complete, returning result 27712 1727096479.42748: _execute() done 27712 1727096479.42828: dumping result to json 27712 1727096479.42831: done dumping result, returning 27712 1727096479.42834: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-cbc7-8716-00000000014f] 27712 1727096479.42836: sending task result for task 0afff68d-5257-cbc7-8716-00000000014f 27712 1727096479.42900: done sending task result for task 0afff68d-5257-cbc7-8716-00000000014f 27712 1727096479.42903: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 27712 1727096479.42987: no more pending results, returning what we have 27712 1727096479.42990: results queue empty 27712 1727096479.42991: checking for any_errors_fatal 27712 1727096479.42998: done checking for any_errors_fatal 27712 1727096479.42998: checking for max_fail_percentage 27712 1727096479.43000: done checking for max_fail_percentage 27712 1727096479.43000: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.43001: done checking to see if all hosts have failed 27712 1727096479.43002: getting the remaining hosts for this loop 27712 1727096479.43003: done getting the remaining hosts for this loop 27712 1727096479.43007: getting the next task for host managed_node2 27712 1727096479.43015: done getting next task for host managed_node2 27712 1727096479.43017: ^ task is: TASK: Show current_interfaces 27712 1727096479.43021: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096479.43025: getting variables 27712 1727096479.43026: in VariableManager get_vars() 27712 1727096479.43068: Calling all_inventory to load vars for managed_node2 27712 1727096479.43071: Calling groups_inventory to load vars for managed_node2 27712 1727096479.43074: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.43085: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.43090: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.43093: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.43469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.43664: done with get_vars() 27712 1727096479.43677: done getting variables 27712 1727096479.43764: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:01:19 -0400 (0:00:00.033) 0:00:05.131 ****** 27712 1727096479.43794: entering _queue_task() for managed_node2/debug 27712 1727096479.43795: Creating lock for debug 27712 1727096479.44283: worker is 1 (out of 1 available) 27712 1727096479.44292: exiting _queue_task() for managed_node2/debug 27712 1727096479.44302: done queuing things up, now waiting for results queue to drain 27712 1727096479.44303: waiting for pending results... 
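The 'Set current_interfaces' result above turns the registered command output into a list fact. A plausible sketch follows; the exact Jinja expression is assumed, though stdout_lines of the earlier "ls -1" would yield exactly the logged ['bonding_masters', 'eth0', 'lo']:

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"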
27712 1727096479.44687: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 27712 1727096479.44792: in run() - task 0afff68d-5257-cbc7-8716-000000000136 27712 1727096479.44972: variable 'ansible_search_path' from source: unknown 27712 1727096479.44977: variable 'ansible_search_path' from source: unknown 27712 1727096479.44979: calling self._execute() 27712 1727096479.45278: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.45281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.45284: variable 'omit' from source: magic vars 27712 1727096479.45571: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.45591: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.45602: variable 'omit' from source: magic vars 27712 1727096479.45642: variable 'omit' from source: magic vars 27712 1727096479.45739: variable 'current_interfaces' from source: set_fact 27712 1727096479.45776: variable 'omit' from source: magic vars 27712 1727096479.45822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096479.45878: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096479.45905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096479.45929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.45947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.45997: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096479.46006: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.46014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.46117: Set connection var ansible_connection to ssh 27712 1727096479.46132: Set connection var ansible_pipelining to False 27712 1727096479.46143: Set connection var ansible_timeout to 10 27712 1727096479.46150: Set connection var ansible_shell_type to sh 27712 1727096479.46162: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096479.46174: Set connection var ansible_shell_executable to /bin/sh 27712 1727096479.46210: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.46220: variable 'ansible_connection' from source: unknown 27712 1727096479.46227: variable 'ansible_module_compression' from source: unknown 27712 1727096479.46233: variable 'ansible_shell_type' from source: unknown 27712 1727096479.46239: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.46246: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.46254: variable 'ansible_pipelining' from source: unknown 27712 1727096479.46260: variable 'ansible_timeout' from source: unknown 27712 1727096479.46267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.46412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
27712 1727096479.46431: variable 'omit' from source: magic vars 27712 1727096479.46442: starting attempt loop 27712 1727096479.46449: running the handler 27712 1727096479.46498: handler run complete 27712 1727096479.46517: attempt loop complete, returning result 27712 1727096479.46524: _execute() done 27712 1727096479.46532: dumping result to json 27712 1727096479.46538: done dumping result, returning 27712 1727096479.46549: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-cbc7-8716-000000000136] 27712 1727096479.46559: sending task result for task 0afff68d-5257-cbc7-8716-000000000136 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 27712 1727096479.46790: no more pending results, returning what we have 27712 1727096479.46794: results queue empty 27712 1727096479.46795: checking for any_errors_fatal 27712 1727096479.46798: done checking for any_errors_fatal 27712 1727096479.46799: checking for max_fail_percentage 27712 1727096479.46800: done checking for max_fail_percentage 27712 1727096479.46801: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.46802: done checking to see if all hosts have failed 27712 1727096479.46802: getting the remaining hosts for this loop 27712 1727096479.46804: done getting the remaining hosts for this loop 27712 1727096479.46807: getting the next task for host managed_node2 27712 1727096479.46822: done getting next task for host managed_node2 27712 1727096479.46825: ^ task is: TASK: Manage test interface 27712 1727096479.46828: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096479.46831: getting variables 27712 1727096479.46833: in VariableManager get_vars() 27712 1727096479.46874: Calling all_inventory to load vars for managed_node2 27712 1727096479.46877: Calling groups_inventory to load vars for managed_node2 27712 1727096479.46879: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.46893: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.46896: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.46900: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.47152: done sending task result for task 0afff68d-5257-cbc7-8716-000000000136 27712 1727096479.47155: WORKER PROCESS EXITING 27712 1727096479.47181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.47373: done with get_vars() 27712 1727096479.47384: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:17 Monday 23 September 2024 09:01:19 -0400 (0:00:00.036) 0:00:05.168 ****** 27712 1727096479.47475: entering _queue_task() for managed_node2/include_tasks 27712 1727096479.47754: worker is 1 (out of 1 available) 27712 1727096479.47766: exiting _queue_task() for managed_node2/include_tasks 27712 1727096479.47781: done queuing things up, now waiting for results queue to drain 27712 1727096479.47783: waiting for pending results... 
27712 1727096479.47981: running TaskExecutor() for managed_node2/TASK: Manage test interface 27712 1727096479.48061: in run() - task 0afff68d-5257-cbc7-8716-00000000000d 27712 1727096479.48077: variable 'ansible_search_path' from source: unknown 27712 1727096479.48111: calling self._execute() 27712 1727096479.48196: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.48201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.48210: variable 'omit' from source: magic vars 27712 1727096479.48574: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.48589: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.48605: _execute() done 27712 1727096479.48608: dumping result to json 27712 1727096479.48611: done dumping result, returning 27712 1727096479.48618: done running TaskExecutor() for managed_node2/TASK: Manage test interface [0afff68d-5257-cbc7-8716-00000000000d] 27712 1727096479.48623: sending task result for task 0afff68d-5257-cbc7-8716-00000000000d 27712 1727096479.48711: done sending task result for task 0afff68d-5257-cbc7-8716-00000000000d 27712 1727096479.48714: WORKER PROCESS EXITING 27712 1727096479.48774: no more pending results, returning what we have 27712 1727096479.48778: in VariableManager get_vars() 27712 1727096479.48821: Calling all_inventory to load vars for managed_node2 27712 1727096479.48825: Calling groups_inventory to load vars for managed_node2 27712 1727096479.48827: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.48837: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.48840: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.48842: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.49048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.49246: done with get_vars() 27712 1727096479.49253: variable 'ansible_search_path' from source: unknown 27712 1727096479.49266: we have included files to process 27712 1727096479.49270: generating all_blocks data 27712 1727096479.49272: done generating all_blocks data 27712 1727096479.49276: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27712 1727096479.49277: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27712 1727096479.49280: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27712 1727096479.49876: in VariableManager get_vars() 27712 1727096479.49907: done with get_vars() 27712 1727096479.50137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 27712 1727096479.50728: done processing included file 27712 1727096479.50730: iterating over new_blocks loaded from include file 27712 1727096479.50731: in VariableManager get_vars() 27712 1727096479.50762: done with get_vars() 27712 1727096479.50765: filtering new block on tags 27712 1727096479.50797: done filtering new block on tags 27712 1727096479.50800: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 
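The include that was just processed (tests_route_device.yml:17 pulling in tasks/manage_test_interface.yml) presumably looks roughly like the sketch below. The vars block is an assumption: the log only shows that 'state' arrives as an include param, while 'type' comes from an earlier set_fact rather than from this include.

    - name: Manage test interface
      ansible.builtin.include_tasks: tasks/manage_test_interface.yml
      vars:
        state: present   # assumption; the log confirms 'state' is an include param but not its value here
      # 'type' is not passed here; per the log it is sourced from set_fact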
27712 1727096479.50985: extending task lists for all hosts with included blocks 27712 1727096479.51247: done extending task lists 27712 1727096479.51248: done processing included files 27712 1727096479.51249: results queue empty 27712 1727096479.51250: checking for any_errors_fatal 27712 1727096479.51252: done checking for any_errors_fatal 27712 1727096479.51253: checking for max_fail_percentage 27712 1727096479.51254: done checking for max_fail_percentage 27712 1727096479.51255: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.51255: done checking to see if all hosts have failed 27712 1727096479.51256: getting the remaining hosts for this loop 27712 1727096479.51257: done getting the remaining hosts for this loop 27712 1727096479.51260: getting the next task for host managed_node2 27712 1727096479.51263: done getting next task for host managed_node2 27712 1727096479.51266: ^ task is: TASK: Ensure state in ["present", "absent"] 27712 1727096479.51270: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096479.51272: getting variables 27712 1727096479.51273: in VariableManager get_vars() 27712 1727096479.51287: Calling all_inventory to load vars for managed_node2 27712 1727096479.51295: Calling groups_inventory to load vars for managed_node2 27712 1727096479.51298: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.51303: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.51306: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.51309: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.51471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.51708: done with get_vars() 27712 1727096479.51717: done getting variables 27712 1727096479.51781: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 09:01:19 -0400 (0:00:00.043) 0:00:05.211 ****** 27712 1727096479.51832: entering _queue_task() for managed_node2/fail 27712 1727096479.51838: Creating lock for fail 27712 1727096479.52245: worker is 1 (out of 1 available) 27712 1727096479.52257: exiting _queue_task() for managed_node2/fail 27712 1727096479.52471: done queuing things up, now waiting for results queue to drain 27712 1727096479.52473: waiting for pending results... 
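The guard task queued here (manage_test_interface.yml:3) fails on an unexpected state value. A minimal sketch; only the conditional is taken from the logged false_condition, the msg wording is an assumption:

    - name: Ensure state in ["present", "absent"]
      ansible.builtin.fail:
        msg: "state must be 'present' or 'absent'"   # assumed wording
      when: state not in ["present", "absent"]       # condition as logged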
27712 1727096479.52593: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 27712 1727096479.52673: in run() - task 0afff68d-5257-cbc7-8716-00000000016a 27712 1727096479.52677: variable 'ansible_search_path' from source: unknown 27712 1727096479.52680: variable 'ansible_search_path' from source: unknown 27712 1727096479.52796: calling self._execute() 27712 1727096479.52799: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.52802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.52809: variable 'omit' from source: magic vars 27712 1727096479.53256: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.53273: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.53379: variable 'state' from source: include params 27712 1727096479.53387: Evaluated conditional (state not in ["present", "absent"]): False 27712 1727096479.53395: when evaluation is False, skipping this task 27712 1727096479.53402: _execute() done 27712 1727096479.53404: dumping result to json 27712 1727096479.53407: done dumping result, returning 27712 1727096479.53413: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-cbc7-8716-00000000016a] 27712 1727096479.53418: sending task result for task 0afff68d-5257-cbc7-8716-00000000016a skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 27712 1727096479.53540: no more pending results, returning what we have 27712 1727096479.53544: results queue empty 27712 1727096479.53545: checking for any_errors_fatal 27712 1727096479.53546: done checking for any_errors_fatal 27712 1727096479.53546: checking for max_fail_percentage 27712 1727096479.53548: done checking for max_fail_percentage 27712 1727096479.53549: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.53550: done checking to see if all hosts have failed 27712 1727096479.53550: getting the remaining hosts for this loop 27712 1727096479.53551: done getting the remaining hosts for this loop 27712 1727096479.53555: getting the next task for host managed_node2 27712 1727096479.53560: done getting next task for host managed_node2 27712 1727096479.53562: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 27712 1727096479.53566: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096479.53571: getting variables 27712 1727096479.53572: in VariableManager get_vars() 27712 1727096479.53612: Calling all_inventory to load vars for managed_node2 27712 1727096479.53615: Calling groups_inventory to load vars for managed_node2 27712 1727096479.53617: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.53629: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.53631: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.53633: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.53801: done sending task result for task 0afff68d-5257-cbc7-8716-00000000016a 27712 1727096479.53804: WORKER PROCESS EXITING 27712 1727096479.53819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.53959: done with get_vars() 27712 1727096479.53966: done getting variables 27712 1727096479.54006: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 09:01:19 -0400 (0:00:00.022) 0:00:05.233 ****** 27712 1727096479.54027: entering _queue_task() for managed_node2/fail 27712 1727096479.54216: worker is 1 (out of 1 available) 27712 1727096479.54227: exiting _queue_task() for managed_node2/fail 27712 1727096479.54239: done queuing things up, now waiting for results queue to drain 27712 1727096479.54240: waiting for pending results... 
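The companion guard at manage_test_interface.yml:8 validates the interface type the same way; again only the conditional is taken from the log:

    - name: Ensure type in ["dummy", "tap", "veth"]
      ansible.builtin.fail:
        msg: "type must be 'dummy', 'tap' or 'veth'"   # assumed wording
      when: type not in ["dummy", "tap", "veth"]       # condition as logged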
27712 1727096479.54404: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 27712 1727096479.54472: in run() - task 0afff68d-5257-cbc7-8716-00000000016b 27712 1727096479.54485: variable 'ansible_search_path' from source: unknown 27712 1727096479.54489: variable 'ansible_search_path' from source: unknown 27712 1727096479.54514: calling self._execute() 27712 1727096479.54589: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.54592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.54601: variable 'omit' from source: magic vars 27712 1727096479.54874: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.54895: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.55200: variable 'type' from source: set_fact 27712 1727096479.55204: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 27712 1727096479.55206: when evaluation is False, skipping this task 27712 1727096479.55209: _execute() done 27712 1727096479.55211: dumping result to json 27712 1727096479.55214: done dumping result, returning 27712 1727096479.55216: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-cbc7-8716-00000000016b] 27712 1727096479.55218: sending task result for task 0afff68d-5257-cbc7-8716-00000000016b 27712 1727096479.55287: done sending task result for task 0afff68d-5257-cbc7-8716-00000000016b 27712 1727096479.55290: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 27712 1727096479.55352: no more pending results, returning what we have 27712 1727096479.55356: results queue empty 27712 1727096479.55357: checking for any_errors_fatal 27712 1727096479.55362: done checking for any_errors_fatal 27712 1727096479.55363: checking for max_fail_percentage 27712 1727096479.55365: done checking for max_fail_percentage 27712 1727096479.55365: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.55366: done checking to see if all hosts have failed 27712 1727096479.55369: getting the remaining hosts for this loop 27712 1727096479.55370: done getting the remaining hosts for this loop 27712 1727096479.55374: getting the next task for host managed_node2 27712 1727096479.55381: done getting next task for host managed_node2 27712 1727096479.55383: ^ task is: TASK: Include the task 'show_interfaces.yml' 27712 1727096479.55387: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096479.55392: getting variables 27712 1727096479.55393: in VariableManager get_vars() 27712 1727096479.55437: Calling all_inventory to load vars for managed_node2 27712 1727096479.55440: Calling groups_inventory to load vars for managed_node2 27712 1727096479.55443: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.55457: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.55459: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.55462: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.55914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.56105: done with get_vars() 27712 1727096479.56116: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 09:01:19 -0400 (0:00:00.021) 0:00:05.255 ****** 27712 1727096479.56218: entering _queue_task() for managed_node2/include_tasks 27712 1727096479.56505: worker is 1 (out of 1 available) 27712 1727096479.56516: exiting _queue_task() for managed_node2/include_tasks 27712 1727096479.56529: done queuing things up, now waiting for results queue to drain 27712 1727096479.56530: waiting for pending results... 27712 1727096479.56753: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 27712 1727096479.56881: in run() - task 0afff68d-5257-cbc7-8716-00000000016c 27712 1727096479.56910: variable 'ansible_search_path' from source: unknown 27712 1727096479.56919: variable 'ansible_search_path' from source: unknown 27712 1727096479.56959: calling self._execute() 27712 1727096479.57060: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.57075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.57110: variable 'omit' from source: magic vars 27712 1727096479.57481: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.57573: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.57576: _execute() done 27712 1727096479.57579: dumping result to json 27712 1727096479.57580: done dumping result, returning 27712 1727096479.57582: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-cbc7-8716-00000000016c] 27712 1727096479.57584: sending task result for task 0afff68d-5257-cbc7-8716-00000000016c 27712 1727096479.57643: done sending task result for task 0afff68d-5257-cbc7-8716-00000000016c 27712 1727096479.57646: WORKER PROCESS EXITING 27712 1727096479.57676: no more pending results, returning what we have 27712 1727096479.57679: in VariableManager get_vars() 27712 1727096479.57715: Calling all_inventory to load vars for managed_node2 27712 1727096479.57718: Calling groups_inventory to load vars for managed_node2 27712 1727096479.57720: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.57728: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.57731: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.57733: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.57887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 27712 1727096479.58001: done with get_vars() 27712 1727096479.58006: variable 'ansible_search_path' from source: unknown 27712 1727096479.58007: variable 'ansible_search_path' from source: unknown 27712 1727096479.58029: we have included files to process 27712 1727096479.58030: generating all_blocks data 27712 1727096479.58031: done generating all_blocks data 27712 1727096479.58034: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096479.58034: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096479.58036: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096479.58108: in VariableManager get_vars() 27712 1727096479.58122: done with get_vars() 27712 1727096479.58217: done processing included file 27712 1727096479.58219: iterating over new_blocks loaded from include file 27712 1727096479.58219: in VariableManager get_vars() 27712 1727096479.58231: done with get_vars() 27712 1727096479.58232: filtering new block on tags 27712 1727096479.58243: done filtering new block on tags 27712 1727096479.58244: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 27712 1727096479.58249: extending task lists for all hosts with included blocks 27712 1727096479.58633: done extending task lists 27712 1727096479.58635: done processing included files 27712 1727096479.58636: results queue empty 27712 1727096479.58636: checking for any_errors_fatal 27712 1727096479.58639: done checking for any_errors_fatal 27712 1727096479.58640: checking for max_fail_percentage 27712 1727096479.58641: done checking for max_fail_percentage 27712 1727096479.58642: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.58643: done checking to see if all hosts have failed 27712 1727096479.58643: getting the remaining hosts for this loop 27712 1727096479.58645: done getting the remaining hosts for this loop 27712 1727096479.58648: getting the next task for host managed_node2 27712 1727096479.58652: done getting next task for host managed_node2 27712 1727096479.58654: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27712 1727096479.58657: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096479.58659: getting variables 27712 1727096479.58660: in VariableManager get_vars() 27712 1727096479.58677: Calling all_inventory to load vars for managed_node2 27712 1727096479.58680: Calling groups_inventory to load vars for managed_node2 27712 1727096479.58682: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.58687: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.58689: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.58692: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.58851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.59057: done with get_vars() 27712 1727096479.59070: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:01:19 -0400 (0:00:00.029) 0:00:05.284 ****** 27712 1727096479.59147: entering _queue_task() for managed_node2/include_tasks 27712 1727096479.59418: worker is 1 (out of 1 available) 27712 1727096479.59431: exiting _queue_task() for managed_node2/include_tasks 27712 1727096479.59442: done queuing things up, now waiting for results queue to drain 27712 1727096479.59443: waiting for pending results... 27712 1727096479.59747: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 27712 1727096479.59752: in run() - task 0afff68d-5257-cbc7-8716-00000000019d 27712 1727096479.59755: variable 'ansible_search_path' from source: unknown 27712 1727096479.59758: variable 'ansible_search_path' from source: unknown 27712 1727096479.59761: calling self._execute() 27712 1727096479.59850: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.59854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.59856: variable 'omit' from source: magic vars 27712 1727096479.60149: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.60158: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.60163: _execute() done 27712 1727096479.60166: dumping result to json 27712 1727096479.60175: done dumping result, returning 27712 1727096479.60182: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-cbc7-8716-00000000019d] 27712 1727096479.60187: sending task result for task 0afff68d-5257-cbc7-8716-00000000019d 27712 1727096479.60274: done sending task result for task 0afff68d-5257-cbc7-8716-00000000019d 27712 1727096479.60278: WORKER PROCESS EXITING 27712 1727096479.60303: no more pending results, returning what we have 27712 1727096479.60307: in VariableManager get_vars() 27712 1727096479.60356: Calling all_inventory to load vars for managed_node2 27712 1727096479.60359: Calling groups_inventory to load vars for managed_node2 27712 1727096479.60361: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.60372: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.60375: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.60378: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.60508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 27712 1727096479.60631: done with get_vars() 27712 1727096479.60636: variable 'ansible_search_path' from source: unknown 27712 1727096479.60637: variable 'ansible_search_path' from source: unknown 27712 1727096479.60681: we have included files to process 27712 1727096479.60682: generating all_blocks data 27712 1727096479.60683: done generating all_blocks data 27712 1727096479.60684: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096479.60684: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096479.60686: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096479.60853: done processing included file 27712 1727096479.60855: iterating over new_blocks loaded from include file 27712 1727096479.60856: in VariableManager get_vars() 27712 1727096479.60871: done with get_vars() 27712 1727096479.60873: filtering new block on tags 27712 1727096479.60883: done filtering new block on tags 27712 1727096479.60885: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 27712 1727096479.60888: extending task lists for all hosts with included blocks 27712 1727096479.60972: done extending task lists 27712 1727096479.60973: done processing included files 27712 1727096479.60973: results queue empty 27712 1727096479.60974: checking for any_errors_fatal 27712 1727096479.60976: done checking for any_errors_fatal 27712 1727096479.60976: checking for max_fail_percentage 27712 1727096479.60977: done checking for max_fail_percentage 27712 1727096479.60977: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.60978: done checking to see if all hosts have failed 27712 1727096479.60978: getting the remaining hosts for this loop 27712 1727096479.60979: done getting the remaining hosts for this loop 27712 1727096479.60980: getting the next task for host managed_node2 27712 1727096479.60983: done getting next task for host managed_node2 27712 1727096479.60985: ^ task is: TASK: Gather current interface info 27712 1727096479.60987: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 27712 1727096479.60988: getting variables 27712 1727096479.60989: in VariableManager get_vars() 27712 1727096479.60997: Calling all_inventory to load vars for managed_node2 27712 1727096479.60999: Calling groups_inventory to load vars for managed_node2 27712 1727096479.61001: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.61005: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.61007: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.61009: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.61111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.61218: done with get_vars() 27712 1727096479.61226: done getting variables 27712 1727096479.61251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:01:19 -0400 (0:00:00.021) 0:00:05.306 ****** 27712 1727096479.61273: entering _queue_task() for managed_node2/command 27712 1727096479.61469: worker is 1 (out of 1 available) 27712 1727096479.61480: exiting _queue_task() for managed_node2/command 27712 1727096479.61493: done queuing things up, now waiting for results queue to drain 27712 1727096479.61494: waiting for pending results... 
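The command task about to run simply lists /sys/class/net. A sketch consistent with the module_args that appear in the result below (chdir "/sys/class/net", raw params "ls -1"); the register name is a hypothetical placeholder, not taken from the log:

    - name: Gather current interface info
      ansible.builtin.command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces_lsdir   # hypothetical name; the real playbook may register under a different variable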
27712 1727096479.61642: running TaskExecutor() for managed_node2/TASK: Gather current interface info 27712 1727096479.61730: in run() - task 0afff68d-5257-cbc7-8716-0000000001d4 27712 1727096479.61738: variable 'ansible_search_path' from source: unknown 27712 1727096479.61752: variable 'ansible_search_path' from source: unknown 27712 1727096479.61790: calling self._execute() 27712 1727096479.61864: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.61868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.61894: variable 'omit' from source: magic vars 27712 1727096479.62373: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.62377: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.62379: variable 'omit' from source: magic vars 27712 1727096479.62382: variable 'omit' from source: magic vars 27712 1727096479.62384: variable 'omit' from source: magic vars 27712 1727096479.62387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096479.62404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096479.62421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096479.62438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.62449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.62481: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096479.62484: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.62486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.62588: Set connection var ansible_connection to ssh 27712 1727096479.62596: Set connection var ansible_pipelining to False 27712 1727096479.62601: Set connection var ansible_timeout to 10 27712 1727096479.62604: Set connection var ansible_shell_type to sh 27712 1727096479.62622: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096479.62625: Set connection var ansible_shell_executable to /bin/sh 27712 1727096479.62637: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.62639: variable 'ansible_connection' from source: unknown 27712 1727096479.62642: variable 'ansible_module_compression' from source: unknown 27712 1727096479.62644: variable 'ansible_shell_type' from source: unknown 27712 1727096479.62647: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.62650: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.62672: variable 'ansible_pipelining' from source: unknown 27712 1727096479.62676: variable 'ansible_timeout' from source: unknown 27712 1727096479.62678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.62841: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096479.62848: variable 'omit' from source: magic vars 27712 
1727096479.62850: starting attempt loop 27712 1727096479.62853: running the handler 27712 1727096479.62855: _low_level_execute_command(): starting 27712 1727096479.62857: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096479.63575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096479.63579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.63583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.63646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.65342: stdout chunk (state=3): >>>/root <<< 27712 1727096479.65432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096479.65465: stderr chunk (state=3): >>><<< 27712 1727096479.65473: stdout chunk (state=3): >>><<< 27712 1727096479.65494: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096479.65506: _low_level_execute_command(): starting 27712 1727096479.65513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382 `" && echo ansible-tmp-1727096479.6549454-28041-134388971692382="` echo /root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382 `" ) && 
sleep 0' 27712 1727096479.66077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096479.66085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.66155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.66160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.68057: stdout chunk (state=3): >>>ansible-tmp-1727096479.6549454-28041-134388971692382=/root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382 <<< 27712 1727096479.68162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096479.68192: stderr chunk (state=3): >>><<< 27712 1727096479.68195: stdout chunk (state=3): >>><<< 27712 1727096479.68210: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096479.6549454-28041-134388971692382=/root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096479.68237: variable 'ansible_module_compression' from source: unknown 27712 1727096479.68288: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096479.68317: variable 'ansible_facts' from source: unknown 27712 1727096479.68375: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/AnsiballZ_command.py 27712 1727096479.68482: Sending initial data 27712 1727096479.68485: Sent initial data (156 bytes) 27712 1727096479.68949: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096479.68953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096479.68955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.68959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096479.68961: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.69007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096479.69010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.69014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.69054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.70689: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096479.70718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096479.70751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpw5q5ks80 /root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/AnsiballZ_command.py <<< 27712 1727096479.70754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/AnsiballZ_command.py" <<< 27712 1727096479.70792: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpw5q5ks80" to remote "/root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/AnsiballZ_command.py" <<< 27712 1727096479.70794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/AnsiballZ_command.py" <<< 27712 1727096479.71280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096479.71325: stderr chunk (state=3): >>><<< 27712 1727096479.71329: stdout chunk (state=3): >>><<< 27712 1727096479.71376: done transferring module to remote 27712 1727096479.71386: _low_level_execute_command(): starting 27712 1727096479.71391: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/ /root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/AnsiballZ_command.py && sleep 0' 27712 1727096479.71851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096479.71854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096479.71857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.71859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096479.71861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.71916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096479.71923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.71925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.71956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.73789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096479.73823: stderr chunk (state=3): >>><<< 27712 1727096479.73826: stdout chunk (state=3): >>><<< 27712 1727096479.73838: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096479.73841: _low_level_execute_command(): starting 27712 1727096479.73847: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/AnsiballZ_command.py && sleep 0' 27712 1727096479.74310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096479.74313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096479.74315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.74318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096479.74319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.74372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096479.74376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.74398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.74424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.90297: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:01:19.898472", "end": "2024-09-23 09:01:19.901897", "delta": "0:00:00.003425", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096479.91880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.15.126 closed. <<< 27712 1727096479.91904: stderr chunk (state=3): >>><<< 27712 1727096479.91907: stdout chunk (state=3): >>><<< 27712 1727096479.91925: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:01:19.898472", "end": "2024-09-23 09:01:19.901897", "delta": "0:00:00.003425", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096479.91952: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096479.91962: _low_level_execute_command(): starting 27712 1727096479.91969: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096479.6549454-28041-134388971692382/ > /dev/null 2>&1 && sleep 0' 27712 1727096479.92438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096479.92442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.92444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096479.92447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096479.92454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096479.92485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096479.92496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096479.92542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096479.94396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096479.94418: stderr chunk (state=3): >>><<< 27712 1727096479.94421: stdout chunk (state=3): >>><<< 27712 1727096479.94434: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096479.94440: handler run complete 27712 1727096479.94458: Evaluated conditional (False): False 27712 1727096479.94469: attempt loop complete, returning result 27712 1727096479.94475: _execute() done 27712 1727096479.94477: dumping result to json 27712 1727096479.94483: done dumping result, returning 27712 1727096479.94490: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-cbc7-8716-0000000001d4] 27712 1727096479.94494: sending task result for task 0afff68d-5257-cbc7-8716-0000000001d4 27712 1727096479.94598: done sending task result for task 0afff68d-5257-cbc7-8716-0000000001d4 27712 1727096479.94601: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003425", "end": "2024-09-23 09:01:19.901897", "rc": 0, "start": "2024-09-23 09:01:19.898472" } STDOUT: bonding_masters eth0 lo 27712 1727096479.94673: no more pending results, returning what we have 27712 1727096479.94676: results queue empty 27712 1727096479.94677: checking for any_errors_fatal 27712 1727096479.94679: done checking for any_errors_fatal 27712 1727096479.94679: checking for max_fail_percentage 27712 1727096479.94681: done checking for max_fail_percentage 27712 1727096479.94681: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.94682: done checking to see if all hosts have failed 27712 1727096479.94683: getting the remaining hosts for this loop 27712 1727096479.94684: done getting the remaining hosts for this loop 27712 1727096479.94687: getting the next task for host managed_node2 27712 1727096479.94694: done getting next task for host managed_node2 27712 1727096479.94696: ^ task is: TASK: Set current_interfaces 27712 1727096479.94701: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096479.94704: getting variables 27712 1727096479.94705: in VariableManager get_vars() 27712 1727096479.94741: Calling all_inventory to load vars for managed_node2 27712 1727096479.94743: Calling groups_inventory to load vars for managed_node2 27712 1727096479.94745: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.94755: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.94757: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.94759: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.94906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.95049: done with get_vars() 27712 1727096479.95057: done getting variables 27712 1727096479.95102: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:01:19 -0400 (0:00:00.338) 0:00:05.644 ****** 27712 1727096479.95125: entering _queue_task() for managed_node2/set_fact 27712 1727096479.95325: worker is 1 (out of 1 available) 27712 1727096479.95339: exiting _queue_task() for managed_node2/set_fact 27712 1727096479.95351: done queuing things up, now waiting for results queue to drain 27712 1727096479.95352: waiting for pending results... 
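The "Gather current interface info" result above (rc=0, stdout "bonding_masters", "eth0", "lo") came from an ansible.legacy.command invocation with chdir=/sys/class/net and _raw_params='ls -1', as shown in the module arguments earlier in the log. A minimal sketch of a task that would produce it is below; the register name _current_interfaces is inferred from the later "variable '_current_interfaces' from source: set_fact" entry, and the surrounding layout of get_current_interfaces.yml is assumed rather than reproduced.

    # Hedged sketch of the "Gather current interface info" task, reconstructed
    # from the module args in the log (command module, chdir=/sys/class/net,
    # raw params 'ls -1'); the register name is inferred, not verbatim.
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces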
27712 1727096479.95504: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 27712 1727096479.95565: in run() - task 0afff68d-5257-cbc7-8716-0000000001d5 27712 1727096479.95582: variable 'ansible_search_path' from source: unknown 27712 1727096479.95586: variable 'ansible_search_path' from source: unknown 27712 1727096479.95613: calling self._execute() 27712 1727096479.95683: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.95687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.95697: variable 'omit' from source: magic vars 27712 1727096479.95962: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.95975: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.95981: variable 'omit' from source: magic vars 27712 1727096479.96015: variable 'omit' from source: magic vars 27712 1727096479.96093: variable '_current_interfaces' from source: set_fact 27712 1727096479.96142: variable 'omit' from source: magic vars 27712 1727096479.96172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096479.96201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096479.96216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096479.96230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.96239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.96265: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096479.96270: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.96272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.96338: Set connection var ansible_connection to ssh 27712 1727096479.96344: Set connection var ansible_pipelining to False 27712 1727096479.96351: Set connection var ansible_timeout to 10 27712 1727096479.96354: Set connection var ansible_shell_type to sh 27712 1727096479.96363: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096479.96366: Set connection var ansible_shell_executable to /bin/sh 27712 1727096479.96385: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.96389: variable 'ansible_connection' from source: unknown 27712 1727096479.96391: variable 'ansible_module_compression' from source: unknown 27712 1727096479.96394: variable 'ansible_shell_type' from source: unknown 27712 1727096479.96396: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.96398: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.96400: variable 'ansible_pipelining' from source: unknown 27712 1727096479.96403: variable 'ansible_timeout' from source: unknown 27712 1727096479.96407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.96509: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096479.96517: variable 'omit' from source: magic vars 27712 1727096479.96523: starting attempt loop 27712 1727096479.96526: running the handler 27712 1727096479.96535: handler run complete 27712 1727096479.96543: attempt loop complete, returning result 27712 1727096479.96547: _execute() done 27712 1727096479.96550: dumping result to json 27712 1727096479.96552: done dumping result, returning 27712 1727096479.96559: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-cbc7-8716-0000000001d5] 27712 1727096479.96562: sending task result for task 0afff68d-5257-cbc7-8716-0000000001d5 27712 1727096479.96640: done sending task result for task 0afff68d-5257-cbc7-8716-0000000001d5 27712 1727096479.96643: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 27712 1727096479.96716: no more pending results, returning what we have 27712 1727096479.96719: results queue empty 27712 1727096479.96720: checking for any_errors_fatal 27712 1727096479.96726: done checking for any_errors_fatal 27712 1727096479.96727: checking for max_fail_percentage 27712 1727096479.96728: done checking for max_fail_percentage 27712 1727096479.96729: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.96729: done checking to see if all hosts have failed 27712 1727096479.96730: getting the remaining hosts for this loop 27712 1727096479.96731: done getting the remaining hosts for this loop 27712 1727096479.96734: getting the next task for host managed_node2 27712 1727096479.96741: done getting next task for host managed_node2 27712 1727096479.96743: ^ task is: TASK: Show current_interfaces 27712 1727096479.96747: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096479.96750: getting variables 27712 1727096479.96752: in VariableManager get_vars() 27712 1727096479.96784: Calling all_inventory to load vars for managed_node2 27712 1727096479.96787: Calling groups_inventory to load vars for managed_node2 27712 1727096479.96789: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.96797: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.96799: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.96802: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.96921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.97039: done with get_vars() 27712 1727096479.97047: done getting variables 27712 1727096479.97092: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:01:19 -0400 (0:00:00.019) 0:00:05.664 ****** 27712 1727096479.97113: entering _queue_task() for managed_node2/debug 27712 1727096479.97297: worker is 1 (out of 1 available) 27712 1727096479.97310: exiting _queue_task() for managed_node2/debug 27712 1727096479.97321: done queuing things up, now waiting for results queue to drain 27712 1727096479.97322: waiting for pending results... 27712 1727096479.97478: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 27712 1727096479.97537: in run() - task 0afff68d-5257-cbc7-8716-00000000019e 27712 1727096479.97549: variable 'ansible_search_path' from source: unknown 27712 1727096479.97553: variable 'ansible_search_path' from source: unknown 27712 1727096479.97584: calling self._execute() 27712 1727096479.97646: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.97652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.97660: variable 'omit' from source: magic vars 27712 1727096479.97922: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.97932: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.97937: variable 'omit' from source: magic vars 27712 1727096479.97968: variable 'omit' from source: magic vars 27712 1727096479.98038: variable 'current_interfaces' from source: set_fact 27712 1727096479.98058: variable 'omit' from source: magic vars 27712 1727096479.98093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096479.98120: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096479.98134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096479.98146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.98156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096479.98180: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096479.98184: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.98186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.98256: Set connection var ansible_connection to ssh 27712 1727096479.98262: Set connection var ansible_pipelining to False 27712 1727096479.98268: Set connection var ansible_timeout to 10 27712 1727096479.98273: Set connection var ansible_shell_type to sh 27712 1727096479.98278: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096479.98283: Set connection var ansible_shell_executable to /bin/sh 27712 1727096479.98299: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.98302: variable 'ansible_connection' from source: unknown 27712 1727096479.98306: variable 'ansible_module_compression' from source: unknown 27712 1727096479.98309: variable 'ansible_shell_type' from source: unknown 27712 1727096479.98311: variable 'ansible_shell_executable' from source: unknown 27712 1727096479.98313: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.98317: variable 'ansible_pipelining' from source: unknown 27712 1727096479.98319: variable 'ansible_timeout' from source: unknown 27712 1727096479.98321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.98425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096479.98435: variable 'omit' from source: magic vars 27712 1727096479.98439: starting attempt loop 27712 1727096479.98442: running the handler 27712 1727096479.98480: handler run complete 27712 1727096479.98490: attempt loop complete, returning result 27712 1727096479.98493: _execute() done 27712 1727096479.98496: dumping result to json 27712 1727096479.98498: done dumping result, returning 27712 1727096479.98505: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-cbc7-8716-00000000019e] 27712 1727096479.98508: sending task result for task 0afff68d-5257-cbc7-8716-00000000019e 27712 1727096479.98597: done sending task result for task 0afff68d-5257-cbc7-8716-00000000019e 27712 1727096479.98600: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 27712 1727096479.98641: no more pending results, returning what we have 27712 1727096479.98644: results queue empty 27712 1727096479.98645: checking for any_errors_fatal 27712 1727096479.98650: done checking for any_errors_fatal 27712 1727096479.98651: checking for max_fail_percentage 27712 1727096479.98652: done checking for max_fail_percentage 27712 1727096479.98653: checking to see if all hosts have failed and the running result is not ok 27712 1727096479.98654: done checking to see if all hosts have failed 27712 1727096479.98655: getting the remaining hosts for this loop 27712 1727096479.98656: done getting the remaining hosts for this loop 27712 1727096479.98659: getting the next task for host managed_node2 27712 1727096479.98666: done getting next task for host managed_node2 27712 1727096479.98672: ^ task is: TASK: Install iproute 27712 1727096479.98675: ^ state is: HOST STATE: block=2, task=5, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096479.98687: getting variables 27712 1727096479.98689: in VariableManager get_vars() 27712 1727096479.98721: Calling all_inventory to load vars for managed_node2 27712 1727096479.98723: Calling groups_inventory to load vars for managed_node2 27712 1727096479.98725: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096479.98733: Calling all_plugins_play to load vars for managed_node2 27712 1727096479.98735: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096479.98738: Calling groups_plugins_play to load vars for managed_node2 27712 1727096479.98884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096479.99000: done with get_vars() 27712 1727096479.99009: done getting variables 27712 1727096479.99046: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 09:01:19 -0400 (0:00:00.019) 0:00:05.683 ****** 27712 1727096479.99065: entering _queue_task() for managed_node2/package 27712 1727096479.99242: worker is 1 (out of 1 available) 27712 1727096479.99255: exiting _queue_task() for managed_node2/package 27712 1727096479.99265: done queuing things up, now waiting for results queue to drain 27712 1727096479.99266: waiting for pending results... 
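The "Set current_interfaces" and "Show current_interfaces" results above (the current_interfaces fact set to ['bonding_masters', 'eth0', 'lo'] and the debug message "current_interfaces: [...]") are consistent with a set_fact/debug pair along the lines of the sketch below. This is a hedged reconstruction: the use of stdout_lines and the exact msg wording are inferred from the fact value and the printed message, not copied from get_current_interfaces.yml or show_interfaces.yml.

    # Hedged sketch of the set_fact/debug pair behind the two results above;
    # stdout_lines and the msg format are inferred, not verbatim.
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"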
27712 1727096479.99419: running TaskExecutor() for managed_node2/TASK: Install iproute 27712 1727096479.99478: in run() - task 0afff68d-5257-cbc7-8716-00000000016d 27712 1727096479.99491: variable 'ansible_search_path' from source: unknown 27712 1727096479.99496: variable 'ansible_search_path' from source: unknown 27712 1727096479.99523: calling self._execute() 27712 1727096479.99588: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096479.99591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096479.99600: variable 'omit' from source: magic vars 27712 1727096479.99863: variable 'ansible_distribution_major_version' from source: facts 27712 1727096479.99876: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096479.99881: variable 'omit' from source: magic vars 27712 1727096479.99909: variable 'omit' from source: magic vars 27712 1727096480.00036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096480.01450: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096480.01510: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096480.01535: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096480.01559: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096480.01586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096480.01654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096480.01680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096480.01700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096480.01729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096480.01739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096480.01819: variable '__network_is_ostree' from source: set_fact 27712 1727096480.01822: variable 'omit' from source: magic vars 27712 1727096480.01847: variable 'omit' from source: magic vars 27712 1727096480.01871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096480.01894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096480.01911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096480.01924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 27712 1727096480.01932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096480.01953: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096480.01956: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096480.01959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096480.02032: Set connection var ansible_connection to ssh 27712 1727096480.02038: Set connection var ansible_pipelining to False 27712 1727096480.02043: Set connection var ansible_timeout to 10 27712 1727096480.02046: Set connection var ansible_shell_type to sh 27712 1727096480.02052: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096480.02057: Set connection var ansible_shell_executable to /bin/sh 27712 1727096480.02079: variable 'ansible_shell_executable' from source: unknown 27712 1727096480.02081: variable 'ansible_connection' from source: unknown 27712 1727096480.02084: variable 'ansible_module_compression' from source: unknown 27712 1727096480.02086: variable 'ansible_shell_type' from source: unknown 27712 1727096480.02088: variable 'ansible_shell_executable' from source: unknown 27712 1727096480.02090: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096480.02094: variable 'ansible_pipelining' from source: unknown 27712 1727096480.02097: variable 'ansible_timeout' from source: unknown 27712 1727096480.02103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096480.02172: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096480.02182: variable 'omit' from source: magic vars 27712 1727096480.02187: starting attempt loop 27712 1727096480.02190: running the handler 27712 1727096480.02196: variable 'ansible_facts' from source: unknown 27712 1727096480.02199: variable 'ansible_facts' from source: unknown 27712 1727096480.02233: _low_level_execute_command(): starting 27712 1727096480.02236: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096480.02743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096480.02747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.02750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096480.02752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096480.02755: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.02810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096480.02813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.02820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.02857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.04564: stdout chunk (state=3): >>>/root <<< 27712 1727096480.04661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.04695: stderr chunk (state=3): >>><<< 27712 1727096480.04698: stdout chunk (state=3): >>><<< 27712 1727096480.04721: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096480.04732: _low_level_execute_command(): starting 27712 1727096480.04738: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600 `" && echo ansible-tmp-1727096480.0472147-28053-210153306735600="` echo /root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600 `" ) && sleep 0' 27712 1727096480.05197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096480.05200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.05203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096480.05205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.05260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096480.05263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.05265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.05307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.07291: stdout chunk (state=3): >>>ansible-tmp-1727096480.0472147-28053-210153306735600=/root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600 <<< 27712 1727096480.07395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.07425: stderr chunk (state=3): >>><<< 27712 1727096480.07428: stdout chunk (state=3): >>><<< 27712 1727096480.07443: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096480.0472147-28053-210153306735600=/root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096480.07487: variable 'ansible_module_compression' from source: unknown 27712 1727096480.07531: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 27712 1727096480.07535: ANSIBALLZ: Acquiring lock 27712 1727096480.07537: ANSIBALLZ: Lock acquired: 140297911472480 27712 1727096480.07540: ANSIBALLZ: Creating module 27712 1727096480.19978: ANSIBALLZ: Writing module into payload 27712 1727096480.20018: ANSIBALLZ: Writing module 27712 1727096480.20053: ANSIBALLZ: Renaming module 27712 1727096480.20075: ANSIBALLZ: Done creating module 27712 1727096480.20100: variable 'ansible_facts' from source: unknown 27712 1727096480.20219: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/AnsiballZ_dnf.py 27712 1727096480.20458: Sending initial data 27712 1727096480.20467: Sent initial data (152 bytes) 27712 1727096480.21085: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096480.21109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.21127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.21204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.22889: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27712 1727096480.22939: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096480.23012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096480.23058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpsmj0d8gp" to remote "/root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/AnsiballZ_dnf.py" <<< 27712 1727096480.23139: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpsmj0d8gp /root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/AnsiballZ_dnf.py <<< 27712 1727096480.24010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.24099: stderr chunk (state=3): >>><<< 27712 1727096480.24112: stdout chunk (state=3): >>><<< 27712 1727096480.24283: done transferring module to remote 27712 1727096480.24289: _low_level_execute_command(): starting 27712 1727096480.24293: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/ /root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/AnsiballZ_dnf.py && sleep 0' 27712 1727096480.25055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096480.25073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096480.25212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.25297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.25326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.27207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.27211: stdout chunk (state=3): >>><<< 27712 1727096480.27214: stderr chunk (state=3): >>><<< 27712 1727096480.27443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096480.27448: _low_level_execute_command(): starting 27712 1727096480.27452: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/AnsiballZ_dnf.py && sleep 0' 27712 1727096480.28780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.28889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.28954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.70900: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 27712 1727096480.75641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096480.75665: stderr chunk (state=3): >>><<< 27712 1727096480.75671: stdout chunk (state=3): >>><<< 27712 1727096480.75689: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
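The dnf payload above returned "Nothing to do" for name=["iproute"], state=present, i.e. the package is already installed. The "Install iproute" task driving it is loaded through the generic package action (see the "Loading ActionModule 'package'" entry before the task banner), and the result handling further down evaluates "__install_status is success" and reports "attempts": 1, which points at a register/until retry pattern. A hedged sketch under those assumptions:

    # Hedged sketch of the "Install iproute" task. The package module and the
    # name/state values match the module args in the log; register/until are
    # inferred from the "__install_status is success" conditional and the
    # "attempts": 1 field in the result, so treat them as assumptions.
    - name: Install iproute
      package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success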
27712 1727096480.75727: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096480.75733: _low_level_execute_command(): starting 27712 1727096480.75738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096480.0472147-28053-210153306735600/ > /dev/null 2>&1 && sleep 0' 27712 1727096480.76154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096480.76165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096480.76193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.76196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096480.76199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.76254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096480.76257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.76263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.76301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.78142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.78165: stderr chunk (state=3): >>><<< 27712 1727096480.78171: stdout chunk (state=3): >>><<< 27712 1727096480.78187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096480.78193: handler run complete 27712 1727096480.78310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096480.78436: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096480.78465: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096480.78492: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096480.78514: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096480.78570: variable '__install_status' from source: unknown 27712 1727096480.78590: Evaluated conditional (__install_status is success): True 27712 1727096480.78602: attempt loop complete, returning result 27712 1727096480.78605: _execute() done 27712 1727096480.78607: dumping result to json 27712 1727096480.78613: done dumping result, returning 27712 1727096480.78619: done running TaskExecutor() for managed_node2/TASK: Install iproute [0afff68d-5257-cbc7-8716-00000000016d] 27712 1727096480.78622: sending task result for task 0afff68d-5257-cbc7-8716-00000000016d 27712 1727096480.78721: done sending task result for task 0afff68d-5257-cbc7-8716-00000000016d ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 27712 1727096480.78802: no more pending results, returning what we have 27712 1727096480.78806: results queue empty 27712 1727096480.78807: checking for any_errors_fatal 27712 1727096480.78811: done checking for any_errors_fatal 27712 1727096480.78811: checking for max_fail_percentage 27712 1727096480.78813: done checking for max_fail_percentage 27712 1727096480.78813: checking to see if all hosts have failed and the running result is not ok 27712 1727096480.78814: done checking to see if all hosts have failed 27712 1727096480.78815: getting the remaining hosts for this loop 27712 1727096480.78816: done getting the remaining hosts for this loop 27712 1727096480.78819: getting the next task for host managed_node2 27712 1727096480.78825: done getting next task for host managed_node2 27712 1727096480.78827: ^ task is: TASK: Create veth interface {{ interface }} 27712 1727096480.78830: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096480.78833: getting variables 27712 1727096480.78834: in VariableManager get_vars() 27712 1727096480.78876: Calling all_inventory to load vars for managed_node2 27712 1727096480.78879: Calling groups_inventory to load vars for managed_node2 27712 1727096480.78881: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096480.78892: Calling all_plugins_play to load vars for managed_node2 27712 1727096480.78894: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096480.78897: Calling groups_plugins_play to load vars for managed_node2 27712 1727096480.79044: WORKER PROCESS EXITING 27712 1727096480.79067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096480.79203: done with get_vars() 27712 1727096480.79211: done getting variables 27712 1727096480.79250: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096480.79347: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 09:01:20 -0400 (0:00:00.803) 0:00:06.487 ****** 27712 1727096480.79382: entering _queue_task() for managed_node2/command 27712 1727096480.79587: worker is 1 (out of 1 available) 27712 1727096480.79601: exiting _queue_task() for managed_node2/command 27712 1727096480.79612: done queuing things up, now waiting for results queue to drain 27712 1727096480.79613: waiting for pending results... 
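The next task banner, "Create veth interface ethtest0", comes from a templated name (Create veth interface {{ interface }} in manage_test_interface.yml), and the entries that follow show the command action, the items lookup plugin, and the conditional type == 'veth' and state == 'present' and interface not in current_interfaces evaluating to True. A hedged sketch is below; the command/with_items/when structure and the conditional come from the log, while the specific ip link commands in the item list are assumptions, since the actual items are not visible in this portion of the log.

    # Hedged sketch of the "Create veth interface {{ interface }}" task.
    # The command/with_items/when structure matches the log; the exact
    # ip link commands are assumed for illustration only.
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}   # assumed
        - ip link set peer{{ interface }} up                                    # assumed
        - ip link set {{ interface }} up                                        # assumed
      when: type == 'veth' and state == 'present' and interface not in current_interfaces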
27712 1727096480.79761: running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 27712 1727096480.79818: in run() - task 0afff68d-5257-cbc7-8716-00000000016e 27712 1727096480.79829: variable 'ansible_search_path' from source: unknown 27712 1727096480.79834: variable 'ansible_search_path' from source: unknown 27712 1727096480.80173: variable 'interface' from source: set_fact 27712 1727096480.80178: variable 'interface' from source: set_fact 27712 1727096480.80213: variable 'interface' from source: set_fact 27712 1727096480.80352: Loaded config def from plugin (lookup/items) 27712 1727096480.80366: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 27712 1727096480.80393: variable 'omit' from source: magic vars 27712 1727096480.80510: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096480.80525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096480.80541: variable 'omit' from source: magic vars 27712 1727096480.80824: variable 'ansible_distribution_major_version' from source: facts 27712 1727096480.80836: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096480.81015: variable 'type' from source: set_fact 27712 1727096480.81024: variable 'state' from source: include params 27712 1727096480.81031: variable 'interface' from source: set_fact 27712 1727096480.81039: variable 'current_interfaces' from source: set_fact 27712 1727096480.81049: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27712 1727096480.81059: variable 'omit' from source: magic vars 27712 1727096480.81100: variable 'omit' from source: magic vars 27712 1727096480.81145: variable 'item' from source: unknown 27712 1727096480.81219: variable 'item' from source: unknown 27712 1727096480.81239: variable 'omit' from source: magic vars 27712 1727096480.81274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096480.81309: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096480.81342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096480.81371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096480.81375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096480.81408: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096480.81411: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096480.81414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096480.81474: Set connection var ansible_connection to ssh 27712 1727096480.81480: Set connection var ansible_pipelining to False 27712 1727096480.81485: Set connection var ansible_timeout to 10 27712 1727096480.81488: Set connection var ansible_shell_type to sh 27712 1727096480.81496: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096480.81499: Set connection var ansible_shell_executable to /bin/sh 27712 1727096480.81517: variable 'ansible_shell_executable' from source: unknown 27712 1727096480.81520: variable 'ansible_connection' from source: unknown 27712 1727096480.81523: variable 
'ansible_module_compression' from source: unknown 27712 1727096480.81525: variable 'ansible_shell_type' from source: unknown 27712 1727096480.81527: variable 'ansible_shell_executable' from source: unknown 27712 1727096480.81529: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096480.81531: variable 'ansible_pipelining' from source: unknown 27712 1727096480.81533: variable 'ansible_timeout' from source: unknown 27712 1727096480.81538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096480.81632: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096480.81644: variable 'omit' from source: magic vars 27712 1727096480.81649: starting attempt loop 27712 1727096480.81656: running the handler 27712 1727096480.81680: _low_level_execute_command(): starting 27712 1727096480.81687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096480.82177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096480.82181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.82187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096480.82190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.82241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096480.82248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.82251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.82287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.83959: stdout chunk (state=3): >>>/root <<< 27712 1727096480.84111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.84118: stdout chunk (state=3): >>><<< 27712 1727096480.84126: stderr chunk (state=3): >>><<< 27712 1727096480.84144: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096480.84174: _low_level_execute_command(): starting 27712 1727096480.84178: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737 `" && echo ansible-tmp-1727096480.8414354-28092-73346272695737="` echo /root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737 `" ) && sleep 0' 27712 1727096480.84878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.84884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.84909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.86847: stdout chunk (state=3): >>>ansible-tmp-1727096480.8414354-28092-73346272695737=/root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737 <<< 27712 1727096480.86959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.86985: stderr chunk (state=3): >>><<< 27712 1727096480.86989: stdout chunk (state=3): >>><<< 27712 1727096480.87005: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096480.8414354-28092-73346272695737=/root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096480.87030: variable 'ansible_module_compression' from source: unknown 27712 1727096480.87075: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096480.87102: variable 'ansible_facts' from source: unknown 27712 1727096480.87156: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/AnsiballZ_command.py 27712 1727096480.87256: Sending initial data 27712 1727096480.87259: Sent initial data (155 bytes) 27712 1727096480.88038: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.88248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096480.88253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.88393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.89981: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096480.90013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096480.90045: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp4xl2lmmq /root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/AnsiballZ_command.py <<< 27712 1727096480.90057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/AnsiballZ_command.py" <<< 27712 1727096480.90077: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp4xl2lmmq" to remote "/root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/AnsiballZ_command.py" <<< 27712 1727096480.90086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/AnsiballZ_command.py" <<< 27712 1727096480.90564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.90695: stderr chunk (state=3): >>><<< 27712 1727096480.90702: stdout chunk (state=3): >>><<< 27712 1727096480.90705: done transferring module to remote 27712 1727096480.90707: _low_level_execute_command(): starting 27712 1727096480.90710: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/ /root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/AnsiballZ_command.py && sleep 0' 27712 1727096480.91483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096480.91486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096480.91488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096480.91494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096480.91496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096480.91498: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096480.91500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.91502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096480.91503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096480.91506: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096480.91508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.91509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.91653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096480.93331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096480.93380: stderr chunk (state=3): >>><<< 27712 1727096480.93389: stdout chunk (state=3): >>><<< 27712 1727096480.93411: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096480.93414: _low_level_execute_command(): starting 27712 1727096480.93422: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/AnsiballZ_command.py && sleep 0' 27712 1727096480.94021: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096480.94030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096480.94041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096480.94055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096480.94072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096480.94077: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096480.94249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096480.94253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096480.94256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096480.94258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096480.94297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.10990: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-23 09:01:21.098089", "end": "2024-09-23 09:01:21.104375", "delta": "0:00:00.006286", "msg": "", "invocation": 
{"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096481.13593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096481.13597: stdout chunk (state=3): >>><<< 27712 1727096481.13602: stderr chunk (state=3): >>><<< 27712 1727096481.13629: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-23 09:01:21.098089", "end": "2024-09-23 09:01:21.104375", "delta": "0:00:00.006286", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096481.13662: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096481.13673: _low_level_execute_command(): starting 27712 1727096481.13681: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096480.8414354-28092-73346272695737/ > /dev/null 2>&1 && sleep 0' 27712 1727096481.15209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096481.15674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.15795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.16300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.19085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.19089: stdout chunk (state=3): >>><<< 27712 1727096481.19097: stderr chunk (state=3): >>><<< 27712 1727096481.19116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.19122: handler run complete 27712 1727096481.19148: Evaluated conditional (False): False 27712 1727096481.19162: attempt loop complete, returning result 27712 1727096481.19185: variable 'item' from source: unknown 27712 1727096481.19263: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.006286", "end": "2024-09-23 09:01:21.104375", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-23 09:01:21.098089" } 27712 1727096481.19732: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096481.19735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096481.19738: variable 'omit' from source: magic vars 27712 1727096481.19964: variable 'ansible_distribution_major_version' from source: facts 27712 1727096481.19969: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096481.20378: variable 'type' from source: set_fact 27712 1727096481.20383: variable 'state' from source: include params 27712 1727096481.20389: variable 'interface' from source: set_fact 27712 1727096481.20440: variable 'current_interfaces' from source: set_fact 27712 1727096481.20624: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27712 1727096481.20627: variable 'omit' from source: magic vars 27712 1727096481.20630: variable 'omit' from source: magic vars 27712 1727096481.20632: variable 'item' from source: unknown 27712 1727096481.20678: variable 'item' from source: unknown 27712 1727096481.21074: variable 'omit' from source: magic vars 27712 1727096481.21083: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096481.21086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096481.21088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096481.21090: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096481.21093: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096481.21095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096481.21173: Set connection var ansible_connection to ssh 27712 1727096481.21176: Set connection var ansible_pipelining to False 27712 1727096481.21178: Set connection var ansible_timeout to 10 27712 1727096481.21180: Set connection var ansible_shell_type to sh 27712 1727096481.21182: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096481.21188: Set connection var ansible_shell_executable to /bin/sh 27712 1727096481.21191: variable 'ansible_shell_executable' from source: unknown 27712 1727096481.21193: variable 'ansible_connection' 
from source: unknown 27712 1727096481.21195: variable 'ansible_module_compression' from source: unknown 27712 1727096481.21197: variable 'ansible_shell_type' from source: unknown 27712 1727096481.21200: variable 'ansible_shell_executable' from source: unknown 27712 1727096481.21204: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096481.21209: variable 'ansible_pipelining' from source: unknown 27712 1727096481.21211: variable 'ansible_timeout' from source: unknown 27712 1727096481.21213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096481.21576: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096481.21580: variable 'omit' from source: magic vars 27712 1727096481.21582: starting attempt loop 27712 1727096481.21584: running the handler 27712 1727096481.21586: _low_level_execute_command(): starting 27712 1727096481.21588: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096481.22670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096481.22683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.22693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096481.22707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096481.22719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096481.22729: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096481.22734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.22748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096481.22755: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096481.22845: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096481.22848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.22851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096481.22853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096481.22855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096481.22879: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.22887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096481.23092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.23140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.24785: stdout chunk (state=3): >>>/root <<< 27712 1727096481.24908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.24911: stdout 
chunk (state=3): >>><<< 27712 1727096481.25002: stderr chunk (state=3): >>><<< 27712 1727096481.25006: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.25008: _low_level_execute_command(): starting 27712 1727096481.25010: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087 `" && echo ansible-tmp-1727096481.2493267-28092-60323760736087="` echo /root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087 `" ) && sleep 0' 27712 1727096481.26633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.26647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.26658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.26933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.26945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096481.27185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.27284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.29129: stdout chunk (state=3): >>>ansible-tmp-1727096481.2493267-28092-60323760736087=/root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087 <<< 27712 1727096481.29231: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 27712 1727096481.29263: stderr chunk (state=3): >>><<< 27712 1727096481.29280: stdout chunk (state=3): >>><<< 27712 1727096481.29321: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096481.2493267-28092-60323760736087=/root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.29399: variable 'ansible_module_compression' from source: unknown 27712 1727096481.29452: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096481.29535: variable 'ansible_facts' from source: unknown 27712 1727096481.29743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/AnsiballZ_command.py 27712 1727096481.30875: Sending initial data 27712 1727096481.30878: Sent initial data (155 bytes) 27712 1727096481.31586: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.31589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.31784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.32086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.32185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.33868: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096481.33917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096481.33956: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp4wl17tg_ /root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/AnsiballZ_command.py <<< 27712 1727096481.33966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/AnsiballZ_command.py" <<< 27712 1727096481.34003: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp4wl17tg_" to remote "/root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/AnsiballZ_command.py" <<< 27712 1727096481.36176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.36180: stderr chunk (state=3): >>><<< 27712 1727096481.36183: stdout chunk (state=3): >>><<< 27712 1727096481.36185: done transferring module to remote 27712 1727096481.36187: _low_level_execute_command(): starting 27712 1727096481.36189: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/ /root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/AnsiballZ_command.py && sleep 0' 27712 1727096481.38196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.38331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.38364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.40309: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.40319: stdout chunk (state=3): >>><<< 27712 1727096481.40331: stderr chunk (state=3): >>><<< 27712 1727096481.40351: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.40554: _low_level_execute_command(): starting 27712 1727096481.40557: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/AnsiballZ_command.py && sleep 0' 27712 1727096481.41652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.41656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096481.41658: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.41660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096481.41662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096481.41664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.41901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.57776: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-23 09:01:21.572991", "end": "2024-09-23 09:01:21.576703", "delta": "0:00:00.003712", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 
up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096481.59380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096481.59384: stdout chunk (state=3): >>><<< 27712 1727096481.59386: stderr chunk (state=3): >>><<< 27712 1727096481.59509: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-23 09:01:21.572991", "end": "2024-09-23 09:01:21.576703", "delta": "0:00:00.003712", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096481.59517: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096481.59520: _low_level_execute_command(): starting 27712 1727096481.59522: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096481.2493267-28092-60323760736087/ > /dev/null 2>&1 && sleep 0' 27712 1727096481.59898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096481.59912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.59925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.59973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.59995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.60022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.61872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.61899: stderr chunk (state=3): >>><<< 27712 1727096481.61902: stdout chunk (state=3): >>><<< 27712 1727096481.61915: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.61921: handler run complete 27712 1727096481.61941: Evaluated conditional (False): False 27712 1727096481.61949: attempt loop complete, returning result 27712 1727096481.61964: variable 'item' from source: unknown 27712 1727096481.62025: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003712", "end": "2024-09-23 09:01:21.576703", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-23 09:01:21.572991" } 27712 1727096481.62137: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096481.62140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096481.62142: variable 'omit' from source: magic vars 27712 1727096481.62250: variable 'ansible_distribution_major_version' from source: facts 27712 1727096481.62256: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096481.62378: variable 'type' from source: set_fact 27712 1727096481.62382: variable 'state' from source: include params 27712 1727096481.62386: variable 'interface' from source: set_fact 27712 1727096481.62390: variable 'current_interfaces' from source: set_fact 27712 1727096481.62396: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27712 1727096481.62400: variable 'omit' from source: magic vars 27712 1727096481.62411: variable 'omit' from source: magic vars 27712 1727096481.62436: variable 'item' from source: unknown 27712 1727096481.62487: variable 'item' from source: unknown 27712 1727096481.62498: variable 'omit' from source: magic vars 27712 1727096481.62514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096481.62522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096481.62527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096481.62537: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096481.62540: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096481.62542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096481.62595: Set connection var ansible_connection to ssh 27712 1727096481.62601: Set connection var ansible_pipelining to False 27712 1727096481.62606: Set connection var ansible_timeout to 10 27712 1727096481.62609: Set connection var ansible_shell_type to sh 27712 1727096481.62615: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096481.62619: Set connection var ansible_shell_executable to /bin/sh 27712 1727096481.62634: variable 'ansible_shell_executable' from source: unknown 27712 1727096481.62636: variable 'ansible_connection' from 
source: unknown 27712 1727096481.62639: variable 'ansible_module_compression' from source: unknown 27712 1727096481.62641: variable 'ansible_shell_type' from source: unknown 27712 1727096481.62643: variable 'ansible_shell_executable' from source: unknown 27712 1727096481.62645: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096481.62651: variable 'ansible_pipelining' from source: unknown 27712 1727096481.62653: variable 'ansible_timeout' from source: unknown 27712 1727096481.62657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096481.62726: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096481.62735: variable 'omit' from source: magic vars 27712 1727096481.62737: starting attempt loop 27712 1727096481.62740: running the handler 27712 1727096481.62745: _low_level_execute_command(): starting 27712 1727096481.62748: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096481.63205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096481.63208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.63211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096481.63213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096481.63215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.63266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096481.63278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.63314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.64940: stdout chunk (state=3): >>>/root <<< 27712 1727096481.65035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.65067: stderr chunk (state=3): >>><<< 27712 1727096481.65075: stdout chunk (state=3): >>><<< 27712 1727096481.65090: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.65098: _low_level_execute_command(): starting 27712 1727096481.65103: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980 `" && echo ansible-tmp-1727096481.6508944-28092-105465144517980="` echo /root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980 `" ) && sleep 0' 27712 1727096481.65558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.65561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096481.65564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.65566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.65572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096481.65574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.65623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.65631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096481.65635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.65664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.67623: stdout chunk (state=3): >>>ansible-tmp-1727096481.6508944-28092-105465144517980=/root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980 <<< 27712 1727096481.67724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.67752: stderr chunk (state=3): >>><<< 27712 1727096481.67756: stdout chunk (state=3): >>><<< 27712 1727096481.67771: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096481.6508944-28092-105465144517980=/root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.67793: variable 'ansible_module_compression' from source: unknown 27712 1727096481.67822: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096481.67836: variable 'ansible_facts' from source: unknown 27712 1727096481.67890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/AnsiballZ_command.py 27712 1727096481.67978: Sending initial data 27712 1727096481.67982: Sent initial data (156 bytes) 27712 1727096481.68436: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096481.68439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096481.68445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.68448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096481.68450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.68501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.68504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096481.68508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.68541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.70117: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096481.70146: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096481.70186: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpgyuififp /root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/AnsiballZ_command.py <<< 27712 1727096481.70195: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/AnsiballZ_command.py" <<< 27712 1727096481.70215: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpgyuififp" to remote "/root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/AnsiballZ_command.py" <<< 27712 1727096481.70217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/AnsiballZ_command.py" <<< 27712 1727096481.70703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.70746: stderr chunk (state=3): >>><<< 27712 1727096481.70750: stdout chunk (state=3): >>><<< 27712 1727096481.70765: done transferring module to remote 27712 1727096481.70828: _low_level_execute_command(): starting 27712 1727096481.70831: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/ /root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/AnsiballZ_command.py && sleep 0' 27712 1727096481.71221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096481.71224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096481.71226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.71228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096481.71230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096481.71232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.71277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.71299: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.71324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.73113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.73134: stderr chunk (state=3): >>><<< 27712 1727096481.73137: stdout chunk (state=3): >>><<< 27712 1727096481.73150: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.73153: _low_level_execute_command(): starting 27712 1727096481.73158: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/AnsiballZ_command.py && sleep 0' 27712 1727096481.73786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096481.73801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096481.73891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096481.73906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096481.73937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.74005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.89824: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", 
"up"], "start": "2024-09-23 09:01:21.891623", "end": "2024-09-23 09:01:21.895538", "delta": "0:00:00.003915", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096481.91464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096481.91475: stdout chunk (state=3): >>><<< 27712 1727096481.91478: stderr chunk (state=3): >>><<< 27712 1727096481.91480: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-23 09:01:21.891623", "end": "2024-09-23 09:01:21.895538", "delta": "0:00:00.003915", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096481.91482: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096481.91484: _low_level_execute_command(): starting 27712 1727096481.91486: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096481.6508944-28092-105465144517980/ > /dev/null 2>&1 && sleep 0' 27712 1727096481.92041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096481.92284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096481.92598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096481.94455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096481.94465: stdout chunk (state=3): >>><<< 27712 1727096481.94479: stderr chunk (state=3): >>><<< 27712 1727096481.94498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096481.94508: handler run complete 27712 1727096481.94531: Evaluated conditional (False): False 27712 1727096481.94587: attempt loop complete, returning result 27712 1727096481.94610: variable 'item' from source: unknown 27712 1727096481.94845: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003915", "end": "2024-09-23 09:01:21.895538", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-23 09:01:21.891623" } 27712 1727096481.95375: dumping result to json 27712 1727096481.95378: done dumping result, returning 27712 1727096481.95380: done running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 [0afff68d-5257-cbc7-8716-00000000016e] 27712 1727096481.95381: sending task result for task 0afff68d-5257-cbc7-8716-00000000016e 27712 1727096481.95423: done sending task result for task 0afff68d-5257-cbc7-8716-00000000016e 27712 1727096481.95426: WORKER PROCESS EXITING 27712 1727096481.95691: no more pending results, returning what we have 27712 1727096481.95694: results queue empty 27712 1727096481.95695: checking for any_errors_fatal 27712 1727096481.95699: done checking for any_errors_fatal 27712 1727096481.95704: checking for max_fail_percentage 27712 1727096481.95705: done checking for max_fail_percentage 27712 1727096481.95706: checking to see if all hosts have failed and the running result is not ok 27712 1727096481.95707: done checking to see if all hosts have failed 27712 1727096481.95708: getting the remaining hosts for this loop 27712 1727096481.95709: done getting the remaining hosts for this loop 27712 1727096481.95712: getting the next task for host managed_node2 27712 1727096481.95716: done getting next task for host managed_node2 27712 1727096481.95718: ^ task is: TASK: Set up veth as managed by NetworkManager 27712 1727096481.95721: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096481.95725: getting variables 27712 1727096481.95726: in VariableManager get_vars() 27712 1727096481.95754: Calling all_inventory to load vars for managed_node2 27712 1727096481.95757: Calling groups_inventory to load vars for managed_node2 27712 1727096481.95759: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096481.95772: Calling all_plugins_play to load vars for managed_node2 27712 1727096481.95775: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096481.95778: Calling groups_plugins_play to load vars for managed_node2 27712 1727096481.96043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096481.96478: done with get_vars() 27712 1727096481.96488: done getting variables 27712 1727096481.96595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 09:01:21 -0400 (0:00:01.172) 0:00:07.659 ****** 27712 1727096481.96622: entering _queue_task() for managed_node2/command 27712 1727096481.97229: worker is 1 (out of 1 available) 27712 1727096481.97242: exiting _queue_task() for managed_node2/command 27712 1727096481.97254: done queuing things up, now waiting for results queue to drain 27712 1727096481.97255: waiting for pending results... 
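For reference, the "Create veth interface ethtest0" loop, the last two items of which finished above, effectively runs the following on managed_node2. The two `ip link set ... up` commands are the loop items printed in this log; the initial `ip link add` item is not visible in this excerpt, so its exact form here is an assumption based on how a veth pair named ethtest0/peerethtest0 is normally created.

```sh
# Assumed first loop item (not shown in this excerpt): create the veth pair
ip link add ethtest0 type veth peer name peerethtest0
# Loop items shown in the log above: bring both ends up
ip link set peerethtest0 up
ip link set ethtest0 up
```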
27712 1727096481.97624: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 27712 1727096481.97700: in run() - task 0afff68d-5257-cbc7-8716-00000000016f 27712 1727096481.97973: variable 'ansible_search_path' from source: unknown 27712 1727096481.97977: variable 'ansible_search_path' from source: unknown 27712 1727096481.97980: calling self._execute() 27712 1727096481.98025: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096481.98038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096481.98472: variable 'omit' from source: magic vars 27712 1727096481.98642: variable 'ansible_distribution_major_version' from source: facts 27712 1727096481.98887: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096481.99043: variable 'type' from source: set_fact 27712 1727096481.99473: variable 'state' from source: include params 27712 1727096481.99478: Evaluated conditional (type == 'veth' and state == 'present'): True 27712 1727096481.99481: variable 'omit' from source: magic vars 27712 1727096481.99483: variable 'omit' from source: magic vars 27712 1727096481.99486: variable 'interface' from source: set_fact 27712 1727096481.99489: variable 'omit' from source: magic vars 27712 1727096481.99491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096481.99708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096481.99734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096481.99759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096481.99780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096481.99816: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096481.99825: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.00173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.00372: Set connection var ansible_connection to ssh 27712 1727096482.00375: Set connection var ansible_pipelining to False 27712 1727096482.00377: Set connection var ansible_timeout to 10 27712 1727096482.00379: Set connection var ansible_shell_type to sh 27712 1727096482.00382: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096482.00384: Set connection var ansible_shell_executable to /bin/sh 27712 1727096482.00386: variable 'ansible_shell_executable' from source: unknown 27712 1727096482.00388: variable 'ansible_connection' from source: unknown 27712 1727096482.00390: variable 'ansible_module_compression' from source: unknown 27712 1727096482.00393: variable 'ansible_shell_type' from source: unknown 27712 1727096482.00394: variable 'ansible_shell_executable' from source: unknown 27712 1727096482.00396: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.00399: variable 'ansible_pipelining' from source: unknown 27712 1727096482.00401: variable 'ansible_timeout' from source: unknown 27712 1727096482.00403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.00632: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096482.00650: variable 'omit' from source: magic vars 27712 1727096482.00659: starting attempt loop 27712 1727096482.00665: running the handler 27712 1727096482.00685: _low_level_execute_command(): starting 27712 1727096482.00696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096482.02228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.02247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096482.02264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.02361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096482.02485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.02498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.04153: stdout chunk (state=3): >>>/root <<< 27712 1727096482.04254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.04296: stderr chunk (state=3): >>><<< 27712 1727096482.04325: stdout chunk (state=3): >>><<< 27712 1727096482.04444: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 27712 1727096482.04465: _low_level_execute_command(): starting 27712 1727096482.04479: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750 `" && echo ansible-tmp-1727096482.0445092-28165-123801812976750="` echo /root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750 `" ) && sleep 0' 27712 1727096482.05694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096482.05699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.05709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.05712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.05893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.07783: stdout chunk (state=3): >>>ansible-tmp-1727096482.0445092-28165-123801812976750=/root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750 <<< 27712 1727096482.07887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.07922: stderr chunk (state=3): >>><<< 27712 1727096482.07924: stdout chunk (state=3): >>><<< 27712 1727096482.07936: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096482.0445092-28165-123801812976750=/root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096482.07977: variable 'ansible_module_compression' from source: unknown 27712 1727096482.08013: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096482.08075: variable 'ansible_facts' from source: unknown 27712 1727096482.08121: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/AnsiballZ_command.py 27712 1727096482.08334: Sending initial data 27712 1727096482.08337: Sent initial data (156 bytes) 27712 1727096482.08992: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.09048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096482.09075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.09127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.10689: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096482.10721: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096482.10758: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpal74ij5s /root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/AnsiballZ_command.py <<< 27712 1727096482.10761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/AnsiballZ_command.py" <<< 27712 1727096482.10784: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpal74ij5s" to remote "/root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/AnsiballZ_command.py" <<< 27712 1727096482.11502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.11506: stdout chunk (state=3): >>><<< 27712 1727096482.11508: stderr chunk (state=3): >>><<< 27712 1727096482.11510: done transferring module to remote 27712 1727096482.11512: _low_level_execute_command(): starting 27712 1727096482.11514: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/ /root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/AnsiballZ_command.py && sleep 0' 27712 1727096482.12083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096482.12086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.12127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096482.12142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096482.12188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.12280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.14111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.14121: stdout chunk (state=3): >>><<< 27712 1727096482.14124: stderr chunk (state=3): >>><<< 27712 1727096482.14148: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096482.14175: _low_level_execute_command(): starting 27712 1727096482.14180: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/AnsiballZ_command.py && sleep 0' 27712 1727096482.14779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096482.14804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.14860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.31891: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-23 09:01:22.299079", "end": "2024-09-23 09:01:22.317828", "delta": "0:00:00.018749", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096482.33518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096482.33548: stderr chunk (state=3): >>><<< 27712 1727096482.33551: stdout chunk (state=3): >>><<< 27712 1727096482.33572: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-23 09:01:22.299079", "end": "2024-09-23 09:01:22.317828", "delta": "0:00:00.018749", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
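The module result above is the entire "Set up veth as managed by NetworkManager" task: a single nmcli call that flips the test device from unmanaged to managed. A minimal way to reproduce and check it by hand; the verification command is an addition for illustration, not something this playbook runs:

```sh
# Hand NetworkManager control of the test interface (the exact command run by the task above)
nmcli d set ethtest0 managed true
# Optional check, not part of the playbook: GENERAL.STATE should no longer report "unmanaged"
nmcli -f GENERAL.STATE device show ethtest0
```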
27712 1727096482.33601: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096482.33608: _low_level_execute_command(): starting 27712 1727096482.33613: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096482.0445092-28165-123801812976750/ > /dev/null 2>&1 && sleep 0' 27712 1727096482.34073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.34078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.34080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096482.34083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.34085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.34133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096482.34136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096482.34142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.34181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.36087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.36111: stderr chunk (state=3): >>><<< 27712 1727096482.36114: stdout chunk (state=3): >>><<< 27712 1727096482.36129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096482.36142: handler run complete 27712 1727096482.36156: Evaluated conditional (False): False 27712 1727096482.36164: attempt loop complete, returning result 27712 1727096482.36169: _execute() done 27712 1727096482.36175: dumping result to json 27712 1727096482.36181: done dumping result, returning 27712 1727096482.36189: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-cbc7-8716-00000000016f] 27712 1727096482.36193: sending task result for task 0afff68d-5257-cbc7-8716-00000000016f 27712 1727096482.36290: done sending task result for task 0afff68d-5257-cbc7-8716-00000000016f 27712 1727096482.36293: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.018749", "end": "2024-09-23 09:01:22.317828", "rc": 0, "start": "2024-09-23 09:01:22.299079" } 27712 1727096482.36356: no more pending results, returning what we have 27712 1727096482.36359: results queue empty 27712 1727096482.36360: checking for any_errors_fatal 27712 1727096482.36376: done checking for any_errors_fatal 27712 1727096482.36377: checking for max_fail_percentage 27712 1727096482.36379: done checking for max_fail_percentage 27712 1727096482.36379: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.36380: done checking to see if all hosts have failed 27712 1727096482.36381: getting the remaining hosts for this loop 27712 1727096482.36382: done getting the remaining hosts for this loop 27712 1727096482.36386: getting the next task for host managed_node2 27712 1727096482.36392: done getting next task for host managed_node2 27712 1727096482.36394: ^ task is: TASK: Delete veth interface {{ interface }} 27712 1727096482.36397: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.36403: getting variables 27712 1727096482.36404: in VariableManager get_vars() 27712 1727096482.36443: Calling all_inventory to load vars for managed_node2 27712 1727096482.36446: Calling groups_inventory to load vars for managed_node2 27712 1727096482.36448: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.36458: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.36460: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.36463: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.36609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.36754: done with get_vars() 27712 1727096482.36762: done getting variables 27712 1727096482.36808: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096482.36897: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 09:01:22 -0400 (0:00:00.402) 0:00:08.062 ****** 27712 1727096482.36922: entering _queue_task() for managed_node2/command 27712 1727096482.37148: worker is 1 (out of 1 available) 27712 1727096482.37163: exiting _queue_task() for managed_node2/command 27712 1727096482.37175: done queuing things up, now waiting for results queue to drain 27712 1727096482.37176: waiting for pending results... 
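The task queued above, "Delete veth interface ethtest0", is skipped in the next record because this run has state == 'present', so its command is never printed here. On an 'absent' run the teardown would plausibly be a single iproute2 call like the sketch below (an assumption, not taken from this log):

```sh
# Hypothetical teardown for state == 'absent'; deleting either end of a veth pair removes both ends
ip link del ethtest0
```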
27712 1727096482.37328: running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 27712 1727096482.37398: in run() - task 0afff68d-5257-cbc7-8716-000000000170 27712 1727096482.37411: variable 'ansible_search_path' from source: unknown 27712 1727096482.37415: variable 'ansible_search_path' from source: unknown 27712 1727096482.37441: calling self._execute() 27712 1727096482.37510: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.37514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.37525: variable 'omit' from source: magic vars 27712 1727096482.37782: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.37792: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.37949: variable 'type' from source: set_fact 27712 1727096482.37954: variable 'state' from source: include params 27712 1727096482.37957: variable 'interface' from source: set_fact 27712 1727096482.37960: variable 'current_interfaces' from source: set_fact 27712 1727096482.37962: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 27712 1727096482.37965: when evaluation is False, skipping this task 27712 1727096482.37966: _execute() done 27712 1727096482.37970: dumping result to json 27712 1727096482.37972: done dumping result, returning 27712 1727096482.37983: done running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 [0afff68d-5257-cbc7-8716-000000000170] 27712 1727096482.37985: sending task result for task 0afff68d-5257-cbc7-8716-000000000170 27712 1727096482.38058: done sending task result for task 0afff68d-5257-cbc7-8716-000000000170 27712 1727096482.38062: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096482.38132: no more pending results, returning what we have 27712 1727096482.38136: results queue empty 27712 1727096482.38138: checking for any_errors_fatal 27712 1727096482.38146: done checking for any_errors_fatal 27712 1727096482.38147: checking for max_fail_percentage 27712 1727096482.38148: done checking for max_fail_percentage 27712 1727096482.38149: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.38150: done checking to see if all hosts have failed 27712 1727096482.38151: getting the remaining hosts for this loop 27712 1727096482.38152: done getting the remaining hosts for this loop 27712 1727096482.38155: getting the next task for host managed_node2 27712 1727096482.38160: done getting next task for host managed_node2 27712 1727096482.38162: ^ task is: TASK: Create dummy interface {{ interface }} 27712 1727096482.38165: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.38170: getting variables 27712 1727096482.38172: in VariableManager get_vars() 27712 1727096482.38209: Calling all_inventory to load vars for managed_node2 27712 1727096482.38211: Calling groups_inventory to load vars for managed_node2 27712 1727096482.38213: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.38223: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.38225: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.38227: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.38351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.38475: done with get_vars() 27712 1727096482.38483: done getting variables 27712 1727096482.38524: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096482.38607: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 09:01:22 -0400 (0:00:00.017) 0:00:08.079 ****** 27712 1727096482.38627: entering _queue_task() for managed_node2/command 27712 1727096482.38826: worker is 1 (out of 1 available) 27712 1727096482.38839: exiting _queue_task() for managed_node2/command 27712 1727096482.38851: done queuing things up, now waiting for results queue to drain 27712 1727096482.38852: waiting for pending results... 
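Note the variable sources reported for these skip decisions: 'state' comes from include params, while 'type', 'interface' and 'current_interfaces' come from set_fact. A plausible shape for that wiring, assumed for illustration rather than taken from the log:

    # In the calling test play (module names and values assumed):
    - name: Include the task 'manage_test_interface.yml'
      ansible.builtin.include_tasks: tasks/manage_test_interface.yml
      vars:
        type: veth
        state: present        # surfaces as 'include params' in the log

    # Near the top of manage_test_interface.yml (assumed):
    - name: Set type={{ type }} and interface={{ interface }}
      ansible.builtin.set_fact:
        type: "{{ type }}"    # re-registered as facts, hence 'set_fact' in the log
        interface: ethtest0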
27712 1727096482.39005: running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 27712 1727096482.39060: in run() - task 0afff68d-5257-cbc7-8716-000000000171 27712 1727096482.39077: variable 'ansible_search_path' from source: unknown 27712 1727096482.39082: variable 'ansible_search_path' from source: unknown 27712 1727096482.39107: calling self._execute() 27712 1727096482.39174: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.39179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.39185: variable 'omit' from source: magic vars 27712 1727096482.39437: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.39446: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.39577: variable 'type' from source: set_fact 27712 1727096482.39580: variable 'state' from source: include params 27712 1727096482.39583: variable 'interface' from source: set_fact 27712 1727096482.39586: variable 'current_interfaces' from source: set_fact 27712 1727096482.39593: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 27712 1727096482.39596: when evaluation is False, skipping this task 27712 1727096482.39599: _execute() done 27712 1727096482.39602: dumping result to json 27712 1727096482.39604: done dumping result, returning 27712 1727096482.39610: done running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 [0afff68d-5257-cbc7-8716-000000000171] 27712 1727096482.39616: sending task result for task 0afff68d-5257-cbc7-8716-000000000171 27712 1727096482.39692: done sending task result for task 0afff68d-5257-cbc7-8716-000000000171 27712 1727096482.39695: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096482.39763: no more pending results, returning what we have 27712 1727096482.39766: results queue empty 27712 1727096482.39767: checking for any_errors_fatal 27712 1727096482.39774: done checking for any_errors_fatal 27712 1727096482.39775: checking for max_fail_percentage 27712 1727096482.39777: done checking for max_fail_percentage 27712 1727096482.39777: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.39778: done checking to see if all hosts have failed 27712 1727096482.39779: getting the remaining hosts for this loop 27712 1727096482.39780: done getting the remaining hosts for this loop 27712 1727096482.39783: getting the next task for host managed_node2 27712 1727096482.39787: done getting next task for host managed_node2 27712 1727096482.39789: ^ task is: TASK: Delete dummy interface {{ interface }} 27712 1727096482.39791: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.39795: getting variables 27712 1727096482.39796: in VariableManager get_vars() 27712 1727096482.39825: Calling all_inventory to load vars for managed_node2 27712 1727096482.39827: Calling groups_inventory to load vars for managed_node2 27712 1727096482.39828: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.39835: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.39836: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.39838: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.39981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.40098: done with get_vars() 27712 1727096482.40105: done getting variables 27712 1727096482.40142: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096482.40217: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 09:01:22 -0400 (0:00:00.016) 0:00:08.095 ****** 27712 1727096482.40236: entering _queue_task() for managed_node2/command 27712 1727096482.40410: worker is 1 (out of 1 available) 27712 1727096482.40421: exiting _queue_task() for managed_node2/command 27712 1727096482.40433: done queuing things up, now waiting for results queue to drain 27712 1727096482.40434: waiting for pending results... 
27712 1727096482.40788: running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 27712 1727096482.40794: in run() - task 0afff68d-5257-cbc7-8716-000000000172 27712 1727096482.40797: variable 'ansible_search_path' from source: unknown 27712 1727096482.40800: variable 'ansible_search_path' from source: unknown 27712 1727096482.40802: calling self._execute() 27712 1727096482.40834: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.40845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.40859: variable 'omit' from source: magic vars 27712 1727096482.41223: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.41249: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.41463: variable 'type' from source: set_fact 27712 1727096482.41482: variable 'state' from source: include params 27712 1727096482.41493: variable 'interface' from source: set_fact 27712 1727096482.41577: variable 'current_interfaces' from source: set_fact 27712 1727096482.41585: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 27712 1727096482.41588: when evaluation is False, skipping this task 27712 1727096482.41590: _execute() done 27712 1727096482.41593: dumping result to json 27712 1727096482.41596: done dumping result, returning 27712 1727096482.41598: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 [0afff68d-5257-cbc7-8716-000000000172] 27712 1727096482.41599: sending task result for task 0afff68d-5257-cbc7-8716-000000000172 27712 1727096482.41661: done sending task result for task 0afff68d-5257-cbc7-8716-000000000172 27712 1727096482.41665: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096482.41737: no more pending results, returning what we have 27712 1727096482.41741: results queue empty 27712 1727096482.41742: checking for any_errors_fatal 27712 1727096482.41748: done checking for any_errors_fatal 27712 1727096482.41748: checking for max_fail_percentage 27712 1727096482.41750: done checking for max_fail_percentage 27712 1727096482.41751: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.41752: done checking to see if all hosts have failed 27712 1727096482.41753: getting the remaining hosts for this loop 27712 1727096482.41754: done getting the remaining hosts for this loop 27712 1727096482.41758: getting the next task for host managed_node2 27712 1727096482.41764: done getting next task for host managed_node2 27712 1727096482.41766: ^ task is: TASK: Create tap interface {{ interface }} 27712 1727096482.41772: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.41777: getting variables 27712 1727096482.41779: in VariableManager get_vars() 27712 1727096482.41826: Calling all_inventory to load vars for managed_node2 27712 1727096482.41829: Calling groups_inventory to load vars for managed_node2 27712 1727096482.41832: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.41844: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.41849: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.41852: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.42251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.42454: done with get_vars() 27712 1727096482.42465: done getting variables 27712 1727096482.42525: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096482.42645: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 09:01:22 -0400 (0:00:00.024) 0:00:08.120 ****** 27712 1727096482.42686: entering _queue_task() for managed_node2/command 27712 1727096482.42964: worker is 1 (out of 1 available) 27712 1727096482.42979: exiting _queue_task() for managed_node2/command 27712 1727096482.43105: done queuing things up, now waiting for results queue to drain 27712 1727096482.43107: waiting for pending results... 
27712 1727096482.43332: running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 27712 1727096482.43431: in run() - task 0afff68d-5257-cbc7-8716-000000000173 27712 1727096482.43436: variable 'ansible_search_path' from source: unknown 27712 1727096482.43438: variable 'ansible_search_path' from source: unknown 27712 1727096482.43462: calling self._execute() 27712 1727096482.43564: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.43578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.43592: variable 'omit' from source: magic vars 27712 1727096482.43963: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.44083: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.44234: variable 'type' from source: set_fact 27712 1727096482.44247: variable 'state' from source: include params 27712 1727096482.44269: variable 'interface' from source: set_fact 27712 1727096482.44282: variable 'current_interfaces' from source: set_fact 27712 1727096482.44299: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 27712 1727096482.44313: when evaluation is False, skipping this task 27712 1727096482.44325: _execute() done 27712 1727096482.44333: dumping result to json 27712 1727096482.44415: done dumping result, returning 27712 1727096482.44419: done running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 [0afff68d-5257-cbc7-8716-000000000173] 27712 1727096482.44424: sending task result for task 0afff68d-5257-cbc7-8716-000000000173 27712 1727096482.44493: done sending task result for task 0afff68d-5257-cbc7-8716-000000000173 27712 1727096482.44497: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096482.44556: no more pending results, returning what we have 27712 1727096482.44561: results queue empty 27712 1727096482.44562: checking for any_errors_fatal 27712 1727096482.44570: done checking for any_errors_fatal 27712 1727096482.44571: checking for max_fail_percentage 27712 1727096482.44573: done checking for max_fail_percentage 27712 1727096482.44574: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.44575: done checking to see if all hosts have failed 27712 1727096482.44575: getting the remaining hosts for this loop 27712 1727096482.44577: done getting the remaining hosts for this loop 27712 1727096482.44581: getting the next task for host managed_node2 27712 1727096482.44588: done getting next task for host managed_node2 27712 1727096482.44591: ^ task is: TASK: Delete tap interface {{ interface }} 27712 1727096482.44594: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.44599: getting variables 27712 1727096482.44600: in VariableManager get_vars() 27712 1727096482.44758: Calling all_inventory to load vars for managed_node2 27712 1727096482.44762: Calling groups_inventory to load vars for managed_node2 27712 1727096482.44765: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.44854: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.44857: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.44861: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.45155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.45312: done with get_vars() 27712 1727096482.45319: done getting variables 27712 1727096482.45358: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096482.45434: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 09:01:22 -0400 (0:00:00.027) 0:00:08.147 ****** 27712 1727096482.45454: entering _queue_task() for managed_node2/command 27712 1727096482.45636: worker is 1 (out of 1 available) 27712 1727096482.45649: exiting _queue_task() for managed_node2/command 27712 1727096482.45659: done queuing things up, now waiting for results queue to drain 27712 1727096482.45660: waiting for pending results... 
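The dummy and tap branches skipped in this stretch follow the same conditional pattern; when their conditions do match, they presumably wrap iproute2 commands along these lines (the when-conditions are taken verbatim from the log, the command invocations are assumptions):

    - name: Create dummy interface {{ interface }}
      ansible.builtin.command: ip link add {{ interface }} type dummy
      when: type == 'dummy' and state == 'present' and interface not in current_interfaces

    - name: Create tap interface {{ interface }}
      ansible.builtin.command: ip tuntap add dev {{ interface }} mode tap
      when: type == 'tap' and state == 'present' and interface not in current_interfaces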
27712 1727096482.45811: running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 27712 1727096482.45869: in run() - task 0afff68d-5257-cbc7-8716-000000000174 27712 1727096482.45885: variable 'ansible_search_path' from source: unknown 27712 1727096482.45889: variable 'ansible_search_path' from source: unknown 27712 1727096482.45917: calling self._execute() 27712 1727096482.45982: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.45987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.45996: variable 'omit' from source: magic vars 27712 1727096482.46249: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.46258: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.46390: variable 'type' from source: set_fact 27712 1727096482.46394: variable 'state' from source: include params 27712 1727096482.46397: variable 'interface' from source: set_fact 27712 1727096482.46402: variable 'current_interfaces' from source: set_fact 27712 1727096482.46409: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 27712 1727096482.46412: when evaluation is False, skipping this task 27712 1727096482.46414: _execute() done 27712 1727096482.46417: dumping result to json 27712 1727096482.46419: done dumping result, returning 27712 1727096482.46429: done running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 [0afff68d-5257-cbc7-8716-000000000174] 27712 1727096482.46431: sending task result for task 0afff68d-5257-cbc7-8716-000000000174 27712 1727096482.46508: done sending task result for task 0afff68d-5257-cbc7-8716-000000000174 27712 1727096482.46511: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096482.46584: no more pending results, returning what we have 27712 1727096482.46587: results queue empty 27712 1727096482.46588: checking for any_errors_fatal 27712 1727096482.46592: done checking for any_errors_fatal 27712 1727096482.46593: checking for max_fail_percentage 27712 1727096482.46595: done checking for max_fail_percentage 27712 1727096482.46595: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.46596: done checking to see if all hosts have failed 27712 1727096482.46597: getting the remaining hosts for this loop 27712 1727096482.46598: done getting the remaining hosts for this loop 27712 1727096482.46600: getting the next task for host managed_node2 27712 1727096482.46607: done getting next task for host managed_node2 27712 1727096482.46609: ^ task is: TASK: Assert device is present 27712 1727096482.46611: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.46614: getting variables 27712 1727096482.46615: in VariableManager get_vars() 27712 1727096482.46647: Calling all_inventory to load vars for managed_node2 27712 1727096482.46649: Calling groups_inventory to load vars for managed_node2 27712 1727096482.46651: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.46658: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.46660: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.46662: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.46775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.46888: done with get_vars() 27712 1727096482.46895: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:21 Monday 23 September 2024 09:01:22 -0400 (0:00:00.015) 0:00:08.162 ****** 27712 1727096482.46956: entering _queue_task() for managed_node2/include_tasks 27712 1727096482.47130: worker is 1 (out of 1 available) 27712 1727096482.47143: exiting _queue_task() for managed_node2/include_tasks 27712 1727096482.47154: done queuing things up, now waiting for results queue to drain 27712 1727096482.47155: waiting for pending results... 27712 1727096482.47301: running TaskExecutor() for managed_node2/TASK: Assert device is present 27712 1727096482.47352: in run() - task 0afff68d-5257-cbc7-8716-00000000000e 27712 1727096482.47363: variable 'ansible_search_path' from source: unknown 27712 1727096482.47397: calling self._execute() 27712 1727096482.47454: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.47457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.47466: variable 'omit' from source: magic vars 27712 1727096482.47714: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.47723: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.47728: _execute() done 27712 1727096482.47732: dumping result to json 27712 1727096482.47734: done dumping result, returning 27712 1727096482.47741: done running TaskExecutor() for managed_node2/TASK: Assert device is present [0afff68d-5257-cbc7-8716-00000000000e] 27712 1727096482.47746: sending task result for task 0afff68d-5257-cbc7-8716-00000000000e 27712 1727096482.47825: done sending task result for task 0afff68d-5257-cbc7-8716-00000000000e 27712 1727096482.47828: WORKER PROCESS EXITING 27712 1727096482.47855: no more pending results, returning what we have 27712 1727096482.47859: in VariableManager get_vars() 27712 1727096482.47899: Calling all_inventory to load vars for managed_node2 27712 1727096482.47902: Calling groups_inventory to load vars for managed_node2 27712 1727096482.47904: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.47913: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.47915: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.47918: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.48065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.48176: done with get_vars() 27712 1727096482.48181: variable 
'ansible_search_path' from source: unknown 27712 1727096482.48190: we have included files to process 27712 1727096482.48190: generating all_blocks data 27712 1727096482.48191: done generating all_blocks data 27712 1727096482.48194: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27712 1727096482.48195: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27712 1727096482.48196: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27712 1727096482.48299: in VariableManager get_vars() 27712 1727096482.48313: done with get_vars() 27712 1727096482.48382: done processing included file 27712 1727096482.48384: iterating over new_blocks loaded from include file 27712 1727096482.48385: in VariableManager get_vars() 27712 1727096482.48397: done with get_vars() 27712 1727096482.48399: filtering new block on tags 27712 1727096482.48409: done filtering new block on tags 27712 1727096482.48411: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 27712 1727096482.48414: extending task lists for all hosts with included blocks 27712 1727096482.48751: done extending task lists 27712 1727096482.48752: done processing included files 27712 1727096482.48753: results queue empty 27712 1727096482.48753: checking for any_errors_fatal 27712 1727096482.48755: done checking for any_errors_fatal 27712 1727096482.48756: checking for max_fail_percentage 27712 1727096482.48756: done checking for max_fail_percentage 27712 1727096482.48757: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.48757: done checking to see if all hosts have failed 27712 1727096482.48758: getting the remaining hosts for this loop 27712 1727096482.48759: done getting the remaining hosts for this loop 27712 1727096482.48760: getting the next task for host managed_node2 27712 1727096482.48762: done getting next task for host managed_node2 27712 1727096482.48764: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27712 1727096482.48766: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.48769: getting variables 27712 1727096482.48769: in VariableManager get_vars() 27712 1727096482.48779: Calling all_inventory to load vars for managed_node2 27712 1727096482.48781: Calling groups_inventory to load vars for managed_node2 27712 1727096482.48782: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.48785: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.48787: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.48788: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.49020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.49140: done with get_vars() 27712 1727096482.49147: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 09:01:22 -0400 (0:00:00.022) 0:00:08.185 ****** 27712 1727096482.49198: entering _queue_task() for managed_node2/include_tasks 27712 1727096482.49395: worker is 1 (out of 1 available) 27712 1727096482.49406: exiting _queue_task() for managed_node2/include_tasks 27712 1727096482.49418: done queuing things up, now waiting for results queue to drain 27712 1727096482.49420: waiting for pending results... 27712 1727096482.49572: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 27712 1727096482.49642: in run() - task 0afff68d-5257-cbc7-8716-000000000214 27712 1727096482.49655: variable 'ansible_search_path' from source: unknown 27712 1727096482.49658: variable 'ansible_search_path' from source: unknown 27712 1727096482.49688: calling self._execute() 27712 1727096482.49747: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.49750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.49763: variable 'omit' from source: magic vars 27712 1727096482.50105: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.50109: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.50111: _execute() done 27712 1727096482.50114: dumping result to json 27712 1727096482.50116: done dumping result, returning 27712 1727096482.50118: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-cbc7-8716-000000000214] 27712 1727096482.50120: sending task result for task 0afff68d-5257-cbc7-8716-000000000214 27712 1727096482.50181: done sending task result for task 0afff68d-5257-cbc7-8716-000000000214 27712 1727096482.50184: WORKER PROCESS EXITING 27712 1727096482.50224: no more pending results, returning what we have 27712 1727096482.50227: in VariableManager get_vars() 27712 1727096482.50264: Calling all_inventory to load vars for managed_node2 27712 1727096482.50266: Calling groups_inventory to load vars for managed_node2 27712 1727096482.50274: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.50283: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.50285: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.50287: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.50404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 27712 1727096482.50515: done with get_vars() 27712 1727096482.50520: variable 'ansible_search_path' from source: unknown 27712 1727096482.50521: variable 'ansible_search_path' from source: unknown 27712 1727096482.50545: we have included files to process 27712 1727096482.50546: generating all_blocks data 27712 1727096482.50548: done generating all_blocks data 27712 1727096482.50548: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096482.50549: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096482.50550: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096482.50701: done processing included file 27712 1727096482.50703: iterating over new_blocks loaded from include file 27712 1727096482.50704: in VariableManager get_vars() 27712 1727096482.50716: done with get_vars() 27712 1727096482.50717: filtering new block on tags 27712 1727096482.50726: done filtering new block on tags 27712 1727096482.50727: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 27712 1727096482.50730: extending task lists for all hosts with included blocks 27712 1727096482.50789: done extending task lists 27712 1727096482.50790: done processing included files 27712 1727096482.50791: results queue empty 27712 1727096482.50791: checking for any_errors_fatal 27712 1727096482.50793: done checking for any_errors_fatal 27712 1727096482.50793: checking for max_fail_percentage 27712 1727096482.50794: done checking for max_fail_percentage 27712 1727096482.50794: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.50795: done checking to see if all hosts have failed 27712 1727096482.50795: getting the remaining hosts for this loop 27712 1727096482.50796: done getting the remaining hosts for this loop 27712 1727096482.50797: getting the next task for host managed_node2 27712 1727096482.50800: done getting next task for host managed_node2 27712 1727096482.50801: ^ task is: TASK: Get stat for interface {{ interface }} 27712 1727096482.50804: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.50805: getting variables 27712 1727096482.50806: in VariableManager get_vars() 27712 1727096482.50815: Calling all_inventory to load vars for managed_node2 27712 1727096482.50817: Calling groups_inventory to load vars for managed_node2 27712 1727096482.50818: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.50822: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.50823: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.50825: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.50924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.51028: done with get_vars() 27712 1727096482.51035: done getting variables 27712 1727096482.51136: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:01:22 -0400 (0:00:00.019) 0:00:08.204 ****** 27712 1727096482.51157: entering _queue_task() for managed_node2/stat 27712 1727096482.51340: worker is 1 (out of 1 available) 27712 1727096482.51353: exiting _queue_task() for managed_node2/stat 27712 1727096482.51364: done queuing things up, now waiting for results queue to drain 27712 1727096482.51365: waiting for pending results... 27712 1727096482.51523: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 27712 1727096482.51595: in run() - task 0afff68d-5257-cbc7-8716-000000000267 27712 1727096482.51604: variable 'ansible_search_path' from source: unknown 27712 1727096482.51608: variable 'ansible_search_path' from source: unknown 27712 1727096482.51635: calling self._execute() 27712 1727096482.51698: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.51702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.51713: variable 'omit' from source: magic vars 27712 1727096482.51966: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.51979: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.51985: variable 'omit' from source: magic vars 27712 1727096482.52014: variable 'omit' from source: magic vars 27712 1727096482.52081: variable 'interface' from source: set_fact 27712 1727096482.52095: variable 'omit' from source: magic vars 27712 1727096482.52125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096482.52151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096482.52166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096482.52183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096482.52193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096482.52216: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096482.52219: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.52222: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 27712 1727096482.52291: Set connection var ansible_connection to ssh 27712 1727096482.52297: Set connection var ansible_pipelining to False 27712 1727096482.52304: Set connection var ansible_timeout to 10 27712 1727096482.52306: Set connection var ansible_shell_type to sh 27712 1727096482.52315: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096482.52317: Set connection var ansible_shell_executable to /bin/sh 27712 1727096482.52333: variable 'ansible_shell_executable' from source: unknown 27712 1727096482.52337: variable 'ansible_connection' from source: unknown 27712 1727096482.52339: variable 'ansible_module_compression' from source: unknown 27712 1727096482.52341: variable 'ansible_shell_type' from source: unknown 27712 1727096482.52344: variable 'ansible_shell_executable' from source: unknown 27712 1727096482.52346: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.52348: variable 'ansible_pipelining' from source: unknown 27712 1727096482.52350: variable 'ansible_timeout' from source: unknown 27712 1727096482.52356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.52500: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096482.52509: variable 'omit' from source: magic vars 27712 1727096482.52514: starting attempt loop 27712 1727096482.52517: running the handler 27712 1727096482.52533: _low_level_execute_command(): starting 27712 1727096482.52537: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096482.53065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.53073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.53076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.53080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.53118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096482.53148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.53198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.54853: stdout chunk (state=3): >>>/root <<< 27712 1727096482.55007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.55010: stdout chunk (state=3): >>><<< 27712 1727096482.55012: stderr chunk 
(state=3): >>><<< 27712 1727096482.55034: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096482.55056: _low_level_execute_command(): starting 27712 1727096482.55138: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775 `" && echo ansible-tmp-1727096482.550411-28198-15222415995775="` echo /root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775 `" ) && sleep 0' 27712 1727096482.55625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096482.55641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096482.55657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.55677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096482.55789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096482.55839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.55907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.57858: stdout chunk (state=3): >>>ansible-tmp-1727096482.550411-28198-15222415995775=/root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775 <<< 27712 1727096482.58012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.58015: stdout chunk (state=3): >>><<< 
27712 1727096482.58018: stderr chunk (state=3): >>><<< 27712 1727096482.58035: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096482.550411-28198-15222415995775=/root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096482.58117: variable 'ansible_module_compression' from source: unknown 27712 1727096482.58230: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27712 1727096482.58233: variable 'ansible_facts' from source: unknown 27712 1727096482.58776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/AnsiballZ_stat.py 27712 1727096482.58804: Sending initial data 27712 1727096482.58814: Sent initial data (151 bytes) 27712 1727096482.59973: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096482.59993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.60049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096482.60104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096482.60315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.60342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.61942: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096482.61980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096482.62008: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp_x__zviu /root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/AnsiballZ_stat.py <<< 27712 1727096482.62039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/AnsiballZ_stat.py" <<< 27712 1727096482.62082: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp_x__zviu" to remote "/root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/AnsiballZ_stat.py" <<< 27712 1727096482.62956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.63023: stderr chunk (state=3): >>><<< 27712 1727096482.63027: stdout chunk (state=3): >>><<< 27712 1727096482.63281: done transferring module to remote 27712 1727096482.63284: _low_level_execute_command(): starting 27712 1727096482.63287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/ /root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/AnsiballZ_stat.py && sleep 0' 27712 1727096482.64877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.64904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096482.64939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.65099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 
1727096482.66919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.66922: stdout chunk (state=3): >>><<< 27712 1727096482.66924: stderr chunk (state=3): >>><<< 27712 1727096482.66948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096482.66983: _low_level_execute_command(): starting 27712 1727096482.66994: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/AnsiballZ_stat.py && sleep 0' 27712 1727096482.68346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096482.68398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096482.68434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.68537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096482.68637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.68708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096482.68727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096482.68764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.68914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.84078: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, 
"isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30410, "dev": 23, "nlink": 1, "atime": 1727096481.1017663, "mtime": 1727096481.1017663, "ctime": 1727096481.1017663, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27712 1727096482.85732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096482.85736: stdout chunk (state=3): >>><<< 27712 1727096482.85738: stderr chunk (state=3): >>><<< 27712 1727096482.85741: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30410, "dev": 23, "nlink": 1, "atime": 1727096481.1017663, "mtime": 1727096481.1017663, "ctime": 1727096481.1017663, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096482.85747: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096482.85760: _low_level_execute_command(): starting 27712 1727096482.85772: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096482.550411-28198-15222415995775/ > /dev/null 2>&1 && sleep 0' 27712 1727096482.86422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096482.86437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096482.86451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096482.86484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096482.86508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096482.86594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096482.86627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096482.86695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096482.88605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096482.88613: stdout chunk (state=3): >>><<< 27712 1727096482.88615: stderr chunk (state=3): >>><<< 27712 1727096482.88633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096482.88644: handler run complete 27712 1727096482.88714: attempt loop complete, returning result 27712 1727096482.88718: _execute() done 27712 1727096482.88776: dumping result to json 27712 1727096482.88779: done dumping result, returning 27712 1727096482.88781: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [0afff68d-5257-cbc7-8716-000000000267] 27712 1727096482.88784: sending task result for task 0afff68d-5257-cbc7-8716-000000000267 ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096481.1017663, "block_size": 4096, "blocks": 0, "ctime": 1727096481.1017663, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30410, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727096481.1017663, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 27712 1727096482.89203: no more pending results, returning what we have 27712 1727096482.89207: results queue empty 27712 1727096482.89208: checking for any_errors_fatal 27712 1727096482.89209: done checking for any_errors_fatal 27712 1727096482.89210: checking for max_fail_percentage 27712 1727096482.89211: done checking for max_fail_percentage 27712 1727096482.89212: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.89213: done checking to see if all hosts have failed 27712 1727096482.89214: getting the remaining hosts for this loop 27712 1727096482.89215: done getting the remaining hosts for this loop 27712 1727096482.89219: getting the next task for host managed_node2 27712 1727096482.89227: done getting next task for host managed_node2 27712 1727096482.89229: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 27712 1727096482.89232: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.89236: getting variables 27712 1727096482.89238: in VariableManager get_vars() 27712 1727096482.89326: Calling all_inventory to load vars for managed_node2 27712 1727096482.89330: Calling groups_inventory to load vars for managed_node2 27712 1727096482.89333: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.89340: done sending task result for task 0afff68d-5257-cbc7-8716-000000000267 27712 1727096482.89342: WORKER PROCESS EXITING 27712 1727096482.89353: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.89356: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.89360: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.89709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.89969: done with get_vars() 27712 1727096482.89984: done getting variables 27712 1727096482.90088: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 27712 1727096482.90215: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 09:01:22 -0400 (0:00:00.390) 0:00:08.595 ****** 27712 1727096482.90246: entering _queue_task() for managed_node2/assert 27712 1727096482.90248: Creating lock for assert 27712 1727096482.90563: worker is 1 (out of 1 available) 27712 1727096482.90579: exiting _queue_task() for managed_node2/assert 27712 1727096482.90590: done queuing things up, now waiting for results queue to drain 27712 1727096482.90591: waiting for pending results... 
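The assert action queued for assert_device_present.yml:5 ends up evaluating a single conditional, interface_stat.stat.exists (see the "Evaluated conditional" entry a few lines further on), and reports "All assertions passed". A minimal task with that behaviour would be something like the following sketch; any fail_msg or extra assertions in the real file are not visible in this excerpt.

    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists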
27712 1727096482.90856: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' 27712 1727096482.90962: in run() - task 0afff68d-5257-cbc7-8716-000000000215 27712 1727096482.90983: variable 'ansible_search_path' from source: unknown 27712 1727096482.90990: variable 'ansible_search_path' from source: unknown 27712 1727096482.91023: calling self._execute() 27712 1727096482.91117: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.91129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.91143: variable 'omit' from source: magic vars 27712 1727096482.91531: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.91549: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.91560: variable 'omit' from source: magic vars 27712 1727096482.91614: variable 'omit' from source: magic vars 27712 1727096482.91722: variable 'interface' from source: set_fact 27712 1727096482.91744: variable 'omit' from source: magic vars 27712 1727096482.91791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096482.91840: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096482.91864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096482.91893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096482.91937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096482.91954: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096482.91963: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.91978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.92153: Set connection var ansible_connection to ssh 27712 1727096482.92156: Set connection var ansible_pipelining to False 27712 1727096482.92158: Set connection var ansible_timeout to 10 27712 1727096482.92161: Set connection var ansible_shell_type to sh 27712 1727096482.92163: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096482.92165: Set connection var ansible_shell_executable to /bin/sh 27712 1727096482.92169: variable 'ansible_shell_executable' from source: unknown 27712 1727096482.92174: variable 'ansible_connection' from source: unknown 27712 1727096482.92179: variable 'ansible_module_compression' from source: unknown 27712 1727096482.92187: variable 'ansible_shell_type' from source: unknown 27712 1727096482.92194: variable 'ansible_shell_executable' from source: unknown 27712 1727096482.92201: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.92208: variable 'ansible_pipelining' from source: unknown 27712 1727096482.92215: variable 'ansible_timeout' from source: unknown 27712 1727096482.92222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.92384: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096482.92481: variable 'omit' from source: magic vars 27712 1727096482.92484: starting attempt loop 27712 1727096482.92487: running the handler 27712 1727096482.92550: variable 'interface_stat' from source: set_fact 27712 1727096482.92578: Evaluated conditional (interface_stat.stat.exists): True 27712 1727096482.92605: handler run complete 27712 1727096482.92622: attempt loop complete, returning result 27712 1727096482.92704: _execute() done 27712 1727096482.92707: dumping result to json 27712 1727096482.92710: done dumping result, returning 27712 1727096482.92712: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' [0afff68d-5257-cbc7-8716-000000000215] 27712 1727096482.92714: sending task result for task 0afff68d-5257-cbc7-8716-000000000215 27712 1727096482.92787: done sending task result for task 0afff68d-5257-cbc7-8716-000000000215 27712 1727096482.92790: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096482.92842: no more pending results, returning what we have 27712 1727096482.92845: results queue empty 27712 1727096482.92846: checking for any_errors_fatal 27712 1727096482.92853: done checking for any_errors_fatal 27712 1727096482.92854: checking for max_fail_percentage 27712 1727096482.92855: done checking for max_fail_percentage 27712 1727096482.92857: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.92858: done checking to see if all hosts have failed 27712 1727096482.92859: getting the remaining hosts for this loop 27712 1727096482.92861: done getting the remaining hosts for this loop 27712 1727096482.92864: getting the next task for host managed_node2 27712 1727096482.92876: done getting next task for host managed_node2 27712 1727096482.92879: ^ task is: TASK: Set interface1 27712 1727096482.92881: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.92885: getting variables 27712 1727096482.92887: in VariableManager get_vars() 27712 1727096482.92930: Calling all_inventory to load vars for managed_node2 27712 1727096482.92933: Calling groups_inventory to load vars for managed_node2 27712 1727096482.92935: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.92947: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.92951: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.92954: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.93446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.93686: done with get_vars() 27712 1727096482.93697: done getting variables 27712 1727096482.93759: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface1] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:23 Monday 23 September 2024 09:01:22 -0400 (0:00:00.035) 0:00:08.631 ****** 27712 1727096482.93791: entering _queue_task() for managed_node2/set_fact 27712 1727096482.94144: worker is 1 (out of 1 available) 27712 1727096482.94162: exiting _queue_task() for managed_node2/set_fact 27712 1727096482.94177: done queuing things up, now waiting for results queue to drain 27712 1727096482.94178: waiting for pending results... 27712 1727096482.94347: running TaskExecutor() for managed_node2/TASK: Set interface1 27712 1727096482.94406: in run() - task 0afff68d-5257-cbc7-8716-00000000000f 27712 1727096482.94418: variable 'ansible_search_path' from source: unknown 27712 1727096482.94447: calling self._execute() 27712 1727096482.94517: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.94521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.94530: variable 'omit' from source: magic vars 27712 1727096482.94812: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.94821: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.94828: variable 'omit' from source: magic vars 27712 1727096482.94850: variable 'omit' from source: magic vars 27712 1727096482.94871: variable 'interface1' from source: play vars 27712 1727096482.94926: variable 'interface1' from source: play vars 27712 1727096482.94944: variable 'omit' from source: magic vars 27712 1727096482.94990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096482.95011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096482.95028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096482.95040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096482.95053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096482.95076: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096482.95079: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.95082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.95146: Set connection var ansible_connection to ssh 27712 1727096482.95153: Set connection var ansible_pipelining to False 27712 1727096482.95157: Set connection var ansible_timeout to 10 27712 1727096482.95164: Set connection var ansible_shell_type to sh 27712 1727096482.95172: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096482.95175: Set connection var ansible_shell_executable to /bin/sh 27712 1727096482.95194: variable 'ansible_shell_executable' from source: unknown 27712 1727096482.95197: variable 'ansible_connection' from source: unknown 27712 1727096482.95199: variable 'ansible_module_compression' from source: unknown 27712 1727096482.95202: variable 'ansible_shell_type' from source: unknown 27712 1727096482.95204: variable 'ansible_shell_executable' from source: unknown 27712 1727096482.95206: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.95208: variable 'ansible_pipelining' from source: unknown 27712 1727096482.95210: variable 'ansible_timeout' from source: unknown 27712 1727096482.95215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.95325: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096482.95333: variable 'omit' from source: magic vars 27712 1727096482.95338: starting attempt loop 27712 1727096482.95341: running the handler 27712 1727096482.95350: handler run complete 27712 1727096482.95358: attempt loop complete, returning result 27712 1727096482.95361: _execute() done 27712 1727096482.95363: dumping result to json 27712 1727096482.95366: done dumping result, returning 27712 1727096482.95375: done running TaskExecutor() for managed_node2/TASK: Set interface1 [0afff68d-5257-cbc7-8716-00000000000f] 27712 1727096482.95385: sending task result for task 0afff68d-5257-cbc7-8716-00000000000f 27712 1727096482.95455: done sending task result for task 0afff68d-5257-cbc7-8716-00000000000f 27712 1727096482.95457: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest1" }, "changed": false } 27712 1727096482.95535: no more pending results, returning what we have 27712 1727096482.95537: results queue empty 27712 1727096482.95538: checking for any_errors_fatal 27712 1727096482.95541: done checking for any_errors_fatal 27712 1727096482.95542: checking for max_fail_percentage 27712 1727096482.95544: done checking for max_fail_percentage 27712 1727096482.95545: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.95545: done checking to see if all hosts have failed 27712 1727096482.95546: getting the remaining hosts for this loop 27712 1727096482.95547: done getting the remaining hosts for this loop 27712 1727096482.95550: getting the next task for host managed_node2 27712 1727096482.95554: done getting next task for host managed_node2 27712 1727096482.95556: ^ task is: TASK: Show interfaces 27712 1727096482.95558: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096482.95561: getting variables 27712 1727096482.95562: in VariableManager get_vars() 27712 1727096482.95599: Calling all_inventory to load vars for managed_node2 27712 1727096482.95602: Calling groups_inventory to load vars for managed_node2 27712 1727096482.95604: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.95612: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.95614: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.95617: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.95780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.95893: done with get_vars() 27712 1727096482.95900: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:26 Monday 23 September 2024 09:01:22 -0400 (0:00:00.021) 0:00:08.653 ****** 27712 1727096482.95978: entering _queue_task() for managed_node2/include_tasks 27712 1727096482.96292: worker is 1 (out of 1 available) 27712 1727096482.96303: exiting _queue_task() for managed_node2/include_tasks 27712 1727096482.96314: done queuing things up, now waiting for results queue to drain 27712 1727096482.96316: waiting for pending results... 27712 1727096482.96586: running TaskExecutor() for managed_node2/TASK: Show interfaces 27712 1727096482.96594: in run() - task 0afff68d-5257-cbc7-8716-000000000010 27712 1727096482.96597: variable 'ansible_search_path' from source: unknown 27712 1727096482.96601: calling self._execute() 27712 1727096482.96689: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.96702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.96716: variable 'omit' from source: magic vars 27712 1727096482.97409: variable 'ansible_distribution_major_version' from source: facts 27712 1727096482.97413: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096482.97415: _execute() done 27712 1727096482.97418: dumping result to json 27712 1727096482.97420: done dumping result, returning 27712 1727096482.97422: done running TaskExecutor() for managed_node2/TASK: Show interfaces [0afff68d-5257-cbc7-8716-000000000010] 27712 1727096482.97425: sending task result for task 0afff68d-5257-cbc7-8716-000000000010 27712 1727096482.97494: done sending task result for task 0afff68d-5257-cbc7-8716-000000000010 27712 1727096482.97497: WORKER PROCESS EXITING 27712 1727096482.97522: no more pending results, returning what we have 27712 1727096482.97526: in VariableManager get_vars() 27712 1727096482.97569: Calling all_inventory to load vars for managed_node2 27712 1727096482.97575: Calling groups_inventory to load vars for managed_node2 27712 1727096482.97577: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.97589: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.97592: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.97595: Calling groups_plugins_play to load vars for managed_node2 27712 
1727096482.97832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.98042: done with get_vars() 27712 1727096482.98050: variable 'ansible_search_path' from source: unknown 27712 1727096482.98064: we have included files to process 27712 1727096482.98065: generating all_blocks data 27712 1727096482.98070: done generating all_blocks data 27712 1727096482.98075: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096482.98076: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096482.98079: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096482.98187: in VariableManager get_vars() 27712 1727096482.98216: done with get_vars() 27712 1727096482.98332: done processing included file 27712 1727096482.98334: iterating over new_blocks loaded from include file 27712 1727096482.98335: in VariableManager get_vars() 27712 1727096482.98346: done with get_vars() 27712 1727096482.98347: filtering new block on tags 27712 1727096482.98357: done filtering new block on tags 27712 1727096482.98359: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 27712 1727096482.98362: extending task lists for all hosts with included blocks 27712 1727096482.98790: done extending task lists 27712 1727096482.98792: done processing included files 27712 1727096482.98792: results queue empty 27712 1727096482.98792: checking for any_errors_fatal 27712 1727096482.98794: done checking for any_errors_fatal 27712 1727096482.98795: checking for max_fail_percentage 27712 1727096482.98796: done checking for max_fail_percentage 27712 1727096482.98796: checking to see if all hosts have failed and the running result is not ok 27712 1727096482.98797: done checking to see if all hosts have failed 27712 1727096482.98797: getting the remaining hosts for this loop 27712 1727096482.98798: done getting the remaining hosts for this loop 27712 1727096482.98799: getting the next task for host managed_node2 27712 1727096482.98802: done getting next task for host managed_node2 27712 1727096482.98803: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27712 1727096482.98805: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096482.98806: getting variables 27712 1727096482.98807: in VariableManager get_vars() 27712 1727096482.98816: Calling all_inventory to load vars for managed_node2 27712 1727096482.98817: Calling groups_inventory to load vars for managed_node2 27712 1727096482.98818: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096482.98823: Calling all_plugins_play to load vars for managed_node2 27712 1727096482.98824: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096482.98826: Calling groups_plugins_play to load vars for managed_node2 27712 1727096482.98911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096482.99018: done with get_vars() 27712 1727096482.99025: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:01:22 -0400 (0:00:00.030) 0:00:08.684 ****** 27712 1727096482.99075: entering _queue_task() for managed_node2/include_tasks 27712 1727096482.99287: worker is 1 (out of 1 available) 27712 1727096482.99298: exiting _queue_task() for managed_node2/include_tasks 27712 1727096482.99309: done queuing things up, now waiting for results queue to drain 27712 1727096482.99310: waiting for pending results... 27712 1727096482.99496: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 27712 1727096482.99580: in run() - task 0afff68d-5257-cbc7-8716-000000000282 27712 1727096482.99583: variable 'ansible_search_path' from source: unknown 27712 1727096482.99586: variable 'ansible_search_path' from source: unknown 27712 1727096482.99619: calling self._execute() 27712 1727096482.99705: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096482.99717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096482.99730: variable 'omit' from source: magic vars 27712 1727096483.00233: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.00236: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.00238: _execute() done 27712 1727096483.00240: dumping result to json 27712 1727096483.00242: done dumping result, returning 27712 1727096483.00245: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-cbc7-8716-000000000282] 27712 1727096483.00248: sending task result for task 0afff68d-5257-cbc7-8716-000000000282 27712 1727096483.00308: done sending task result for task 0afff68d-5257-cbc7-8716-000000000282 27712 1727096483.00311: WORKER PROCESS EXITING 27712 1727096483.00353: no more pending results, returning what we have 27712 1727096483.00358: in VariableManager get_vars() 27712 1727096483.00410: Calling all_inventory to load vars for managed_node2 27712 1727096483.00414: Calling groups_inventory to load vars for managed_node2 27712 1727096483.00417: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.00543: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.00547: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.00551: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.00803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 27712 1727096483.01031: done with get_vars() 27712 1727096483.01042: variable 'ansible_search_path' from source: unknown 27712 1727096483.01043: variable 'ansible_search_path' from source: unknown 27712 1727096483.01080: we have included files to process 27712 1727096483.01081: generating all_blocks data 27712 1727096483.01082: done generating all_blocks data 27712 1727096483.01083: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096483.01084: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096483.01085: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096483.01271: done processing included file 27712 1727096483.01274: iterating over new_blocks loaded from include file 27712 1727096483.01275: in VariableManager get_vars() 27712 1727096483.01287: done with get_vars() 27712 1727096483.01288: filtering new block on tags 27712 1727096483.01300: done filtering new block on tags 27712 1727096483.01301: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 27712 1727096483.01304: extending task lists for all hosts with included blocks 27712 1727096483.01361: done extending task lists 27712 1727096483.01362: done processing included files 27712 1727096483.01362: results queue empty 27712 1727096483.01362: checking for any_errors_fatal 27712 1727096483.01364: done checking for any_errors_fatal 27712 1727096483.01365: checking for max_fail_percentage 27712 1727096483.01365: done checking for max_fail_percentage 27712 1727096483.01366: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.01366: done checking to see if all hosts have failed 27712 1727096483.01368: getting the remaining hosts for this loop 27712 1727096483.01369: done getting the remaining hosts for this loop 27712 1727096483.01372: getting the next task for host managed_node2 27712 1727096483.01374: done getting next task for host managed_node2 27712 1727096483.01376: ^ task is: TASK: Gather current interface info 27712 1727096483.01378: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096483.01380: getting variables 27712 1727096483.01380: in VariableManager get_vars() 27712 1727096483.01389: Calling all_inventory to load vars for managed_node2 27712 1727096483.01390: Calling groups_inventory to load vars for managed_node2 27712 1727096483.01391: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.01395: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.01396: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.01398: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.01483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.01593: done with get_vars() 27712 1727096483.01599: done getting variables 27712 1727096483.01623: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:01:23 -0400 (0:00:00.025) 0:00:08.709 ****** 27712 1727096483.01642: entering _queue_task() for managed_node2/command 27712 1727096483.01859: worker is 1 (out of 1 available) 27712 1727096483.01873: exiting _queue_task() for managed_node2/command 27712 1727096483.01885: done queuing things up, now waiting for results queue to drain 27712 1727096483.01887: waiting for pending results... 
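The "Gather current interface info" task at get_current_interfaces.yml:3 enumerates interfaces by listing /sys/class/net; the module arguments echoed further below (chdir: /sys/class/net, raw params "ls -1") confirm this. A minimal equivalent is sketched here; the register name is illustrative only, since it does not appear in this excerpt.

    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces  # register name assumed; not shown in this log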
27712 1727096483.02042: running TaskExecutor() for managed_node2/TASK: Gather current interface info 27712 1727096483.02106: in run() - task 0afff68d-5257-cbc7-8716-0000000002e0 27712 1727096483.02118: variable 'ansible_search_path' from source: unknown 27712 1727096483.02159: variable 'ansible_search_path' from source: unknown 27712 1727096483.02376: calling self._execute() 27712 1727096483.02380: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.02382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.02386: variable 'omit' from source: magic vars 27712 1727096483.02732: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.02780: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.02814: variable 'omit' from source: magic vars 27712 1727096483.02888: variable 'omit' from source: magic vars 27712 1727096483.03013: variable 'omit' from source: magic vars 27712 1727096483.03093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096483.03145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096483.03179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096483.03337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.03353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.03385: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096483.03388: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.03391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.03505: Set connection var ansible_connection to ssh 27712 1727096483.03513: Set connection var ansible_pipelining to False 27712 1727096483.03519: Set connection var ansible_timeout to 10 27712 1727096483.03521: Set connection var ansible_shell_type to sh 27712 1727096483.03529: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096483.03567: Set connection var ansible_shell_executable to /bin/sh 27712 1727096483.03594: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.03597: variable 'ansible_connection' from source: unknown 27712 1727096483.03600: variable 'ansible_module_compression' from source: unknown 27712 1727096483.03603: variable 'ansible_shell_type' from source: unknown 27712 1727096483.03605: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.03607: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.03609: variable 'ansible_pipelining' from source: unknown 27712 1727096483.03611: variable 'ansible_timeout' from source: unknown 27712 1727096483.03616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.03777: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096483.03789: variable 'omit' from source: magic vars 27712 
1727096483.03793: starting attempt loop 27712 1727096483.03797: running the handler 27712 1727096483.03814: _low_level_execute_command(): starting 27712 1727096483.03821: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096483.04600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.04618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.04635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.04706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.06379: stdout chunk (state=3): >>>/root <<< 27712 1727096483.06471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.06506: stderr chunk (state=3): >>><<< 27712 1727096483.06510: stdout chunk (state=3): >>><<< 27712 1727096483.06530: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096483.06544: _low_level_execute_command(): starting 27712 1727096483.06549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755 `" && echo ansible-tmp-1727096483.065315-28237-212217341256755="` echo /root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755 `" ) && sleep 
0' 27712 1727096483.07223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.07240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.07244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.07246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.09185: stdout chunk (state=3): >>>ansible-tmp-1727096483.065315-28237-212217341256755=/root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755 <<< 27712 1727096483.09290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.09300: stderr chunk (state=3): >>><<< 27712 1727096483.09303: stdout chunk (state=3): >>><<< 27712 1727096483.09320: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096483.065315-28237-212217341256755=/root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096483.09346: variable 'ansible_module_compression' from source: unknown 27712 1727096483.09399: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096483.09427: variable 'ansible_facts' from source: unknown 27712 1727096483.09484: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/AnsiballZ_command.py 27712 1727096483.09584: Sending initial data 27712 1727096483.09588: Sent initial data (155 bytes) 27712 1727096483.10023: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.10027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.10030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096483.10032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096483.10034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.10086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.10090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.10129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.11761: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096483.11798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096483.11831: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpcac918xv /root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/AnsiballZ_command.py <<< 27712 1727096483.11839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/AnsiballZ_command.py" <<< 27712 1727096483.11865: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpcac918xv" to remote "/root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/AnsiballZ_command.py" <<< 27712 1727096483.11878: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/AnsiballZ_command.py" <<< 27712 1727096483.12362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.12403: stderr chunk (state=3): >>><<< 27712 1727096483.12407: stdout chunk (state=3): >>><<< 27712 1727096483.12442: done transferring module to remote 27712 1727096483.12451: _low_level_execute_command(): starting 27712 1727096483.12456: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/ /root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/AnsiballZ_command.py && sleep 0' 27712 1727096483.12914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.12918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.12921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096483.12923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.12978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.12981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.12982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.13018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.14847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.14877: stderr chunk (state=3): >>><<< 27712 1727096483.14880: stdout chunk (state=3): >>><<< 27712 1727096483.14896: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096483.14899: _low_level_execute_command(): starting 27712 1727096483.14904: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/AnsiballZ_command.py && sleep 0' 27712 1727096483.15331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096483.15335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096483.15387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.15412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.15415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.15427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.15483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.31314: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:01:23.308750", "end": "2024-09-23 09:01:23.312090", "delta": "0:00:00.003340", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096483.32998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096483.33003: stdout chunk (state=3): >>><<< 27712 1727096483.33005: stderr chunk (state=3): >>><<< 27712 1727096483.33029: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:01:23.308750", "end": "2024-09-23 09:01:23.312090", "delta": "0:00:00.003340", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
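Reconstructed from the module arguments logged above ('chdir': '/sys/class/net', '_raw_params': 'ls -1') and from the registered variable name '_current_interfaces' that shows up later in this run, the task that produced this result is likely equivalent to the sketch below. It is an inference, not a copy of the real get_current_interfaces.yml; in particular the changed_when line is deduced from the module itself returning "changed": true while the reported task result says "changed": false.

- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # name taken from the later "variable '_current_interfaces'" log entries
  changed_when: false             # inferred: raw module output says changed=true, reported result says changed=false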
27712 1727096483.33082: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096483.33107: _low_level_execute_command(): starting 27712 1727096483.33174: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096483.065315-28237-212217341256755/ > /dev/null 2>&1 && sleep 0' 27712 1727096483.33771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096483.33787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096483.33800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.33819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096483.33932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.33962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.34028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.35930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.35953: stderr chunk (state=3): >>><<< 27712 1727096483.35962: stdout chunk (state=3): >>><<< 27712 1727096483.35988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096483.36000: handler run complete 27712 1727096483.36082: Evaluated conditional (False): False 27712 1727096483.36085: attempt loop complete, returning result 27712 1727096483.36088: _execute() done 27712 1727096483.36090: dumping result to json 27712 1727096483.36092: done dumping result, returning 27712 1727096483.36094: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-cbc7-8716-0000000002e0] 27712 1727096483.36096: sending task result for task 0afff68d-5257-cbc7-8716-0000000002e0 27712 1727096483.36415: done sending task result for task 0afff68d-5257-cbc7-8716-0000000002e0 27712 1727096483.36418: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003340", "end": "2024-09-23 09:01:23.312090", "rc": 0, "start": "2024-09-23 09:01:23.308750" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 27712 1727096483.36597: no more pending results, returning what we have 27712 1727096483.36600: results queue empty 27712 1727096483.36601: checking for any_errors_fatal 27712 1727096483.36603: done checking for any_errors_fatal 27712 1727096483.36603: checking for max_fail_percentage 27712 1727096483.36605: done checking for max_fail_percentage 27712 1727096483.36606: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.36607: done checking to see if all hosts have failed 27712 1727096483.36608: getting the remaining hosts for this loop 27712 1727096483.36609: done getting the remaining hosts for this loop 27712 1727096483.36612: getting the next task for host managed_node2 27712 1727096483.36619: done getting next task for host managed_node2 27712 1727096483.36621: ^ task is: TASK: Set current_interfaces 27712 1727096483.36626: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096483.36630: getting variables 27712 1727096483.36631: in VariableManager get_vars() 27712 1727096483.36786: Calling all_inventory to load vars for managed_node2 27712 1727096483.36789: Calling groups_inventory to load vars for managed_node2 27712 1727096483.36792: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.36801: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.36804: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.36807: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.36966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.37196: done with get_vars() 27712 1727096483.37206: done getting variables 27712 1727096483.37303: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:01:23 -0400 (0:00:00.356) 0:00:09.066 ****** 27712 1727096483.37340: entering _queue_task() for managed_node2/set_fact 27712 1727096483.37811: worker is 1 (out of 1 available) 27712 1727096483.37824: exiting _queue_task() for managed_node2/set_fact 27712 1727096483.37837: done queuing things up, now waiting for results queue to drain 27712 1727096483.37839: waiting for pending results... 
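A minimal sketch of the set_fact task at get_current_interfaces.yml:9 that is being queued here, assuming it simply turns the registered command output into a list; the Jinja2 expression is an assumption inferred from the fact value printed in the result that follows.

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed expression; matches the list shown in the result below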
27712 1727096483.38585: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 27712 1727096483.38590: in run() - task 0afff68d-5257-cbc7-8716-0000000002e1 27712 1727096483.38593: variable 'ansible_search_path' from source: unknown 27712 1727096483.38596: variable 'ansible_search_path' from source: unknown 27712 1727096483.38599: calling self._execute() 27712 1727096483.38744: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.38786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.38801: variable 'omit' from source: magic vars 27712 1727096483.39398: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.39418: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.39429: variable 'omit' from source: magic vars 27712 1727096483.39482: variable 'omit' from source: magic vars 27712 1727096483.39622: variable '_current_interfaces' from source: set_fact 27712 1727096483.39666: variable 'omit' from source: magic vars 27712 1727096483.39713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096483.39756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096483.39974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096483.39977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.39980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.39982: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096483.39985: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.39987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.39989: Set connection var ansible_connection to ssh 27712 1727096483.39991: Set connection var ansible_pipelining to False 27712 1727096483.39993: Set connection var ansible_timeout to 10 27712 1727096483.39995: Set connection var ansible_shell_type to sh 27712 1727096483.40010: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096483.40020: Set connection var ansible_shell_executable to /bin/sh 27712 1727096483.40045: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.40054: variable 'ansible_connection' from source: unknown 27712 1727096483.40062: variable 'ansible_module_compression' from source: unknown 27712 1727096483.40073: variable 'ansible_shell_type' from source: unknown 27712 1727096483.40081: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.40088: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.40096: variable 'ansible_pipelining' from source: unknown 27712 1727096483.40103: variable 'ansible_timeout' from source: unknown 27712 1727096483.40117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.40264: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096483.40286: variable 'omit' from source: magic vars 27712 1727096483.40297: starting attempt loop 27712 1727096483.40305: running the handler 27712 1727096483.40321: handler run complete 27712 1727096483.40341: attempt loop complete, returning result 27712 1727096483.40348: _execute() done 27712 1727096483.40356: dumping result to json 27712 1727096483.40364: done dumping result, returning 27712 1727096483.40381: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-cbc7-8716-0000000002e1] 27712 1727096483.40439: sending task result for task 0afff68d-5257-cbc7-8716-0000000002e1 27712 1727096483.40510: done sending task result for task 0afff68d-5257-cbc7-8716-0000000002e1 27712 1727096483.40513: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0" ] }, "changed": false } 27712 1727096483.40604: no more pending results, returning what we have 27712 1727096483.40607: results queue empty 27712 1727096483.40608: checking for any_errors_fatal 27712 1727096483.40617: done checking for any_errors_fatal 27712 1727096483.40618: checking for max_fail_percentage 27712 1727096483.40620: done checking for max_fail_percentage 27712 1727096483.40621: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.40621: done checking to see if all hosts have failed 27712 1727096483.40623: getting the remaining hosts for this loop 27712 1727096483.40624: done getting the remaining hosts for this loop 27712 1727096483.40628: getting the next task for host managed_node2 27712 1727096483.40636: done getting next task for host managed_node2 27712 1727096483.40638: ^ task is: TASK: Show current_interfaces 27712 1727096483.40642: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096483.40645: getting variables 27712 1727096483.40647: in VariableManager get_vars() 27712 1727096483.40691: Calling all_inventory to load vars for managed_node2 27712 1727096483.40694: Calling groups_inventory to load vars for managed_node2 27712 1727096483.40697: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.40708: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.40710: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.40713: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.41049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.41318: done with get_vars() 27712 1727096483.41327: done getting variables 27712 1727096483.41394: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:01:23 -0400 (0:00:00.040) 0:00:09.107 ****** 27712 1727096483.41420: entering _queue_task() for managed_node2/debug 27712 1727096483.41660: worker is 1 (out of 1 available) 27712 1727096483.41675: exiting _queue_task() for managed_node2/debug 27712 1727096483.41687: done queuing things up, now waiting for results queue to drain 27712 1727096483.41688: waiting for pending results... 27712 1727096483.41986: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 27712 1727096483.42024: in run() - task 0afff68d-5257-cbc7-8716-000000000283 27712 1727096483.42044: variable 'ansible_search_path' from source: unknown 27712 1727096483.42092: variable 'ansible_search_path' from source: unknown 27712 1727096483.42097: calling self._execute() 27712 1727096483.42181: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.42197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.42212: variable 'omit' from source: magic vars 27712 1727096483.42566: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.42585: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.42636: variable 'omit' from source: magic vars 27712 1727096483.42639: variable 'omit' from source: magic vars 27712 1727096483.42734: variable 'current_interfaces' from source: set_fact 27712 1727096483.42772: variable 'omit' from source: magic vars 27712 1727096483.42814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096483.42872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096483.42897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096483.42928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.43174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.43177: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096483.43179: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.43181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.43183: Set connection var ansible_connection to ssh 27712 1727096483.43185: Set connection var ansible_pipelining to False 27712 1727096483.43186: Set connection var ansible_timeout to 10 27712 1727096483.43188: Set connection var ansible_shell_type to sh 27712 1727096483.43190: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096483.43525: Set connection var ansible_shell_executable to /bin/sh 27712 1727096483.43529: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.43532: variable 'ansible_connection' from source: unknown 27712 1727096483.43534: variable 'ansible_module_compression' from source: unknown 27712 1727096483.43536: variable 'ansible_shell_type' from source: unknown 27712 1727096483.43538: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.43539: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.43541: variable 'ansible_pipelining' from source: unknown 27712 1727096483.43543: variable 'ansible_timeout' from source: unknown 27712 1727096483.43545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.43744: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096483.43778: variable 'omit' from source: magic vars 27712 1727096483.43783: starting attempt loop 27712 1727096483.43786: running the handler 27712 1727096483.43853: handler run complete 27712 1727096483.43867: attempt loop complete, returning result 27712 1727096483.43871: _execute() done 27712 1727096483.43877: dumping result to json 27712 1727096483.43880: done dumping result, returning 27712 1727096483.43887: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-cbc7-8716-000000000283] 27712 1727096483.43891: sending task result for task 0afff68d-5257-cbc7-8716-000000000283 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0'] 27712 1727096483.44034: no more pending results, returning what we have 27712 1727096483.44036: results queue empty 27712 1727096483.44037: checking for any_errors_fatal 27712 1727096483.44043: done checking for any_errors_fatal 27712 1727096483.44044: checking for max_fail_percentage 27712 1727096483.44046: done checking for max_fail_percentage 27712 1727096483.44047: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.44048: done checking to see if all hosts have failed 27712 1727096483.44048: getting the remaining hosts for this loop 27712 1727096483.44050: done getting the remaining hosts for this loop 27712 1727096483.44053: getting the next task for host managed_node2 27712 1727096483.44060: done getting next task for host managed_node2 27712 1727096483.44069: ^ task is: TASK: Manage test interface 27712 1727096483.44072: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096483.44075: getting variables 27712 1727096483.44076: in VariableManager get_vars() 27712 1727096483.44111: Calling all_inventory to load vars for managed_node2 27712 1727096483.44114: Calling groups_inventory to load vars for managed_node2 27712 1727096483.44115: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.44124: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.44126: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.44128: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.44331: done sending task result for task 0afff68d-5257-cbc7-8716-000000000283 27712 1727096483.44335: WORKER PROCESS EXITING 27712 1727096483.44347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.44538: done with get_vars() 27712 1727096483.44546: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:28 Monday 23 September 2024 09:01:23 -0400 (0:00:00.031) 0:00:09.139 ****** 27712 1727096483.44612: entering _queue_task() for managed_node2/include_tasks 27712 1727096483.44799: worker is 1 (out of 1 available) 27712 1727096483.44812: exiting _queue_task() for managed_node2/include_tasks 27712 1727096483.44826: done queuing things up, now waiting for results queue to drain 27712 1727096483.44827: waiting for pending results... 27712 1727096483.44985: running TaskExecutor() for managed_node2/TASK: Manage test interface 27712 1727096483.45037: in run() - task 0afff68d-5257-cbc7-8716-000000000011 27712 1727096483.45055: variable 'ansible_search_path' from source: unknown 27712 1727096483.45082: calling self._execute() 27712 1727096483.45143: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.45148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.45157: variable 'omit' from source: magic vars 27712 1727096483.45695: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.45699: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.45701: _execute() done 27712 1727096483.45703: dumping result to json 27712 1727096483.45705: done dumping result, returning 27712 1727096483.45708: done running TaskExecutor() for managed_node2/TASK: Manage test interface [0afff68d-5257-cbc7-8716-000000000011] 27712 1727096483.45710: sending task result for task 0afff68d-5257-cbc7-8716-000000000011 27712 1727096483.45782: done sending task result for task 0afff68d-5257-cbc7-8716-000000000011 27712 1727096483.45785: WORKER PROCESS EXITING 27712 1727096483.45823: no more pending results, returning what we have 27712 1727096483.45828: in VariableManager get_vars() 27712 1727096483.45885: Calling all_inventory to load vars for managed_node2 27712 1727096483.45889: Calling groups_inventory to load vars for managed_node2 27712 1727096483.45892: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.45907: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.45911: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.45914: Calling 
groups_plugins_play to load vars for managed_node2 27712 1727096483.46544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.46989: done with get_vars() 27712 1727096483.47080: variable 'ansible_search_path' from source: unknown 27712 1727096483.47094: we have included files to process 27712 1727096483.47095: generating all_blocks data 27712 1727096483.47096: done generating all_blocks data 27712 1727096483.47100: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27712 1727096483.47101: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27712 1727096483.47104: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 27712 1727096483.47749: in VariableManager get_vars() 27712 1727096483.47952: done with get_vars() 27712 1727096483.48730: done processing included file 27712 1727096483.48733: iterating over new_blocks loaded from include file 27712 1727096483.48734: in VariableManager get_vars() 27712 1727096483.48754: done with get_vars() 27712 1727096483.48755: filtering new block on tags 27712 1727096483.48825: done filtering new block on tags 27712 1727096483.48829: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 27712 1727096483.48834: extending task lists for all hosts with included blocks 27712 1727096483.50131: done extending task lists 27712 1727096483.50132: done processing included files 27712 1727096483.50133: results queue empty 27712 1727096483.50134: checking for any_errors_fatal 27712 1727096483.50137: done checking for any_errors_fatal 27712 1727096483.50138: checking for max_fail_percentage 27712 1727096483.50139: done checking for max_fail_percentage 27712 1727096483.50140: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.50140: done checking to see if all hosts have failed 27712 1727096483.50141: getting the remaining hosts for this loop 27712 1727096483.50142: done getting the remaining hosts for this loop 27712 1727096483.50145: getting the next task for host managed_node2 27712 1727096483.50149: done getting next task for host managed_node2 27712 1727096483.50151: ^ task is: TASK: Ensure state in ["present", "absent"] 27712 1727096483.50154: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096483.50156: getting variables 27712 1727096483.50157: in VariableManager get_vars() 27712 1727096483.50182: Calling all_inventory to load vars for managed_node2 27712 1727096483.50184: Calling groups_inventory to load vars for managed_node2 27712 1727096483.50187: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.50192: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.50194: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.50197: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.50463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.50966: done with get_vars() 27712 1727096483.50980: done getting variables 27712 1727096483.51017: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 09:01:23 -0400 (0:00:00.064) 0:00:09.203 ****** 27712 1727096483.51166: entering _queue_task() for managed_node2/fail 27712 1727096483.51728: worker is 1 (out of 1 available) 27712 1727096483.51743: exiting _queue_task() for managed_node2/fail 27712 1727096483.51755: done queuing things up, now waiting for results queue to drain 27712 1727096483.51759: waiting for pending results... 27712 1727096483.52187: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 27712 1727096483.52191: in run() - task 0afff68d-5257-cbc7-8716-0000000002fc 27712 1727096483.52195: variable 'ansible_search_path' from source: unknown 27712 1727096483.52197: variable 'ansible_search_path' from source: unknown 27712 1727096483.52199: calling self._execute() 27712 1727096483.52373: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.52377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.52380: variable 'omit' from source: magic vars 27712 1727096483.52671: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.52689: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.53161: variable 'state' from source: include params 27712 1727096483.53183: Evaluated conditional (state not in ["present", "absent"]): False 27712 1727096483.53196: when evaluation is False, skipping this task 27712 1727096483.53232: _execute() done 27712 1727096483.53239: dumping result to json 27712 1727096483.53246: done dumping result, returning 27712 1727096483.53256: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-cbc7-8716-0000000002fc] 27712 1727096483.53279: sending task result for task 0afff68d-5257-cbc7-8716-0000000002fc 27712 1727096483.53402: done sending task result for task 0afff68d-5257-cbc7-8716-0000000002fc 27712 1727096483.53405: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 27712 1727096483.53488: no more pending 
results, returning what we have 27712 1727096483.53492: results queue empty 27712 1727096483.53493: checking for any_errors_fatal 27712 1727096483.53496: done checking for any_errors_fatal 27712 1727096483.53497: checking for max_fail_percentage 27712 1727096483.53498: done checking for max_fail_percentage 27712 1727096483.53499: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.53500: done checking to see if all hosts have failed 27712 1727096483.53501: getting the remaining hosts for this loop 27712 1727096483.53502: done getting the remaining hosts for this loop 27712 1727096483.53506: getting the next task for host managed_node2 27712 1727096483.53512: done getting next task for host managed_node2 27712 1727096483.53516: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 27712 1727096483.53520: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096483.53524: getting variables 27712 1727096483.53526: in VariableManager get_vars() 27712 1727096483.53575: Calling all_inventory to load vars for managed_node2 27712 1727096483.53578: Calling groups_inventory to load vars for managed_node2 27712 1727096483.53580: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.53594: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.53597: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.53599: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.53828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.54137: done with get_vars() 27712 1727096483.54147: done getting variables 27712 1727096483.54211: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 09:01:23 -0400 (0:00:00.031) 0:00:09.235 ****** 27712 1727096483.54239: entering _queue_task() for managed_node2/fail 27712 1727096483.54483: worker is 1 (out of 1 available) 27712 1727096483.54528: exiting _queue_task() for managed_node2/fail 27712 1727096483.54541: done queuing things up, now waiting for results queue to drain 27712 1727096483.54542: waiting for pending results... 
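The two validation tasks around this point are skipped because their when conditions, recorded as false_condition in the log, evaluate to False. A sketch of how they plausibly appear in manage_test_interface.yml follows; the conditions are copied from the log, while the fail messages are assumptions rather than quotes from the original file.

- name: Ensure state in ["present", "absent"]
  fail:
    msg: state must be either 'present' or 'absent'   # message text assumed
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: type must be 'dummy', 'tap' or 'veth'   # message text assumed
  when: type not in ["dummy", "tap", "veth"]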
27712 1727096483.54760: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 27712 1727096483.54824: in run() - task 0afff68d-5257-cbc7-8716-0000000002fd 27712 1727096483.54836: variable 'ansible_search_path' from source: unknown 27712 1727096483.54843: variable 'ansible_search_path' from source: unknown 27712 1727096483.54875: calling self._execute() 27712 1727096483.54938: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.54946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.54953: variable 'omit' from source: magic vars 27712 1727096483.55472: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.55476: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.55479: variable 'type' from source: set_fact 27712 1727096483.55481: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 27712 1727096483.55483: when evaluation is False, skipping this task 27712 1727096483.55485: _execute() done 27712 1727096483.55488: dumping result to json 27712 1727096483.55490: done dumping result, returning 27712 1727096483.55496: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-cbc7-8716-0000000002fd] 27712 1727096483.55506: sending task result for task 0afff68d-5257-cbc7-8716-0000000002fd skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 27712 1727096483.55644: no more pending results, returning what we have 27712 1727096483.55647: results queue empty 27712 1727096483.55648: checking for any_errors_fatal 27712 1727096483.55654: done checking for any_errors_fatal 27712 1727096483.55654: checking for max_fail_percentage 27712 1727096483.55656: done checking for max_fail_percentage 27712 1727096483.55656: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.55657: done checking to see if all hosts have failed 27712 1727096483.55658: getting the remaining hosts for this loop 27712 1727096483.55659: done getting the remaining hosts for this loop 27712 1727096483.55662: getting the next task for host managed_node2 27712 1727096483.55673: done getting next task for host managed_node2 27712 1727096483.55675: ^ task is: TASK: Include the task 'show_interfaces.yml' 27712 1727096483.55679: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096483.55683: getting variables 27712 1727096483.55684: in VariableManager get_vars() 27712 1727096483.55723: Calling all_inventory to load vars for managed_node2 27712 1727096483.55726: Calling groups_inventory to load vars for managed_node2 27712 1727096483.55729: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.55739: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.55742: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.55745: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.55982: done sending task result for task 0afff68d-5257-cbc7-8716-0000000002fd 27712 1727096483.55985: WORKER PROCESS EXITING 27712 1727096483.56005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.56187: done with get_vars() 27712 1727096483.56198: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 09:01:23 -0400 (0:00:00.020) 0:00:09.255 ****** 27712 1727096483.56263: entering _queue_task() for managed_node2/include_tasks 27712 1727096483.56450: worker is 1 (out of 1 available) 27712 1727096483.56463: exiting _queue_task() for managed_node2/include_tasks 27712 1727096483.56478: done queuing things up, now waiting for results queue to drain 27712 1727096483.56480: waiting for pending results... 27712 1727096483.56671: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 27712 1727096483.56778: in run() - task 0afff68d-5257-cbc7-8716-0000000002fe 27712 1727096483.56800: variable 'ansible_search_path' from source: unknown 27712 1727096483.56808: variable 'ansible_search_path' from source: unknown 27712 1727096483.56848: calling self._execute() 27712 1727096483.56936: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.56949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.56963: variable 'omit' from source: magic vars 27712 1727096483.57319: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.57337: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.57348: _execute() done 27712 1727096483.57355: dumping result to json 27712 1727096483.57413: done dumping result, returning 27712 1727096483.57417: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-cbc7-8716-0000000002fe] 27712 1727096483.57419: sending task result for task 0afff68d-5257-cbc7-8716-0000000002fe 27712 1727096483.57536: no more pending results, returning what we have 27712 1727096483.57541: in VariableManager get_vars() 27712 1727096483.57587: Calling all_inventory to load vars for managed_node2 27712 1727096483.57590: Calling groups_inventory to load vars for managed_node2 27712 1727096483.57592: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.57603: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.57606: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.57608: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.57789: done sending task result for task 0afff68d-5257-cbc7-8716-0000000002fe 27712 
1727096483.57792: WORKER PROCESS EXITING 27712 1727096483.57797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.57910: done with get_vars() 27712 1727096483.57915: variable 'ansible_search_path' from source: unknown 27712 1727096483.57916: variable 'ansible_search_path' from source: unknown 27712 1727096483.57940: we have included files to process 27712 1727096483.57941: generating all_blocks data 27712 1727096483.57942: done generating all_blocks data 27712 1727096483.57945: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096483.57945: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096483.57947: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 27712 1727096483.58014: in VariableManager get_vars() 27712 1727096483.58029: done with get_vars() 27712 1727096483.58102: done processing included file 27712 1727096483.58104: iterating over new_blocks loaded from include file 27712 1727096483.58105: in VariableManager get_vars() 27712 1727096483.58118: done with get_vars() 27712 1727096483.58120: filtering new block on tags 27712 1727096483.58131: done filtering new block on tags 27712 1727096483.58132: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 27712 1727096483.58136: extending task lists for all hosts with included blocks 27712 1727096483.58354: done extending task lists 27712 1727096483.58355: done processing included files 27712 1727096483.58355: results queue empty 27712 1727096483.58356: checking for any_errors_fatal 27712 1727096483.58357: done checking for any_errors_fatal 27712 1727096483.58358: checking for max_fail_percentage 27712 1727096483.58359: done checking for max_fail_percentage 27712 1727096483.58359: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.58359: done checking to see if all hosts have failed 27712 1727096483.58360: getting the remaining hosts for this loop 27712 1727096483.58361: done getting the remaining hosts for this loop 27712 1727096483.58362: getting the next task for host managed_node2 27712 1727096483.58365: done getting next task for host managed_node2 27712 1727096483.58366: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 27712 1727096483.58371: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 27712 1727096483.58373: getting variables 27712 1727096483.58374: in VariableManager get_vars() 27712 1727096483.58382: Calling all_inventory to load vars for managed_node2 27712 1727096483.58383: Calling groups_inventory to load vars for managed_node2 27712 1727096483.58384: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.58388: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.58389: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.58391: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.58652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.58756: done with get_vars() 27712 1727096483.58764: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 09:01:23 -0400 (0:00:00.025) 0:00:09.281 ****** 27712 1727096483.58813: entering _queue_task() for managed_node2/include_tasks 27712 1727096483.58990: worker is 1 (out of 1 available) 27712 1727096483.59001: exiting _queue_task() for managed_node2/include_tasks 27712 1727096483.59011: done queuing things up, now waiting for results queue to drain 27712 1727096483.59012: waiting for pending results... 27712 1727096483.59167: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 27712 1727096483.59245: in run() - task 0afff68d-5257-cbc7-8716-000000000374 27712 1727096483.59254: variable 'ansible_search_path' from source: unknown 27712 1727096483.59258: variable 'ansible_search_path' from source: unknown 27712 1727096483.59288: calling self._execute() 27712 1727096483.59351: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.59360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.59370: variable 'omit' from source: magic vars 27712 1727096483.59629: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.59639: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.59644: _execute() done 27712 1727096483.59647: dumping result to json 27712 1727096483.59652: done dumping result, returning 27712 1727096483.59657: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-cbc7-8716-000000000374] 27712 1727096483.59662: sending task result for task 0afff68d-5257-cbc7-8716-000000000374 27712 1727096483.59747: done sending task result for task 0afff68d-5257-cbc7-8716-000000000374 27712 1727096483.59750: WORKER PROCESS EXITING 27712 1727096483.59802: no more pending results, returning what we have 27712 1727096483.59806: in VariableManager get_vars() 27712 1727096483.59842: Calling all_inventory to load vars for managed_node2 27712 1727096483.59845: Calling groups_inventory to load vars for managed_node2 27712 1727096483.59847: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.59856: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.59859: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.59861: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.59974: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.60091: done with get_vars() 27712 1727096483.60096: variable 'ansible_search_path' from source: unknown 27712 1727096483.60097: variable 'ansible_search_path' from source: unknown 27712 1727096483.60134: we have included files to process 27712 1727096483.60134: generating all_blocks data 27712 1727096483.60136: done generating all_blocks data 27712 1727096483.60137: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096483.60137: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096483.60139: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 27712 1727096483.60308: done processing included file 27712 1727096483.60310: iterating over new_blocks loaded from include file 27712 1727096483.60311: in VariableManager get_vars() 27712 1727096483.60326: done with get_vars() 27712 1727096483.60327: filtering new block on tags 27712 1727096483.60339: done filtering new block on tags 27712 1727096483.60340: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 27712 1727096483.60343: extending task lists for all hosts with included blocks 27712 1727096483.60429: done extending task lists 27712 1727096483.60431: done processing included files 27712 1727096483.60431: results queue empty 27712 1727096483.60432: checking for any_errors_fatal 27712 1727096483.60434: done checking for any_errors_fatal 27712 1727096483.60434: checking for max_fail_percentage 27712 1727096483.60435: done checking for max_fail_percentage 27712 1727096483.60435: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.60436: done checking to see if all hosts have failed 27712 1727096483.60436: getting the remaining hosts for this loop 27712 1727096483.60437: done getting the remaining hosts for this loop 27712 1727096483.60439: getting the next task for host managed_node2 27712 1727096483.60442: done getting next task for host managed_node2 27712 1727096483.60443: ^ task is: TASK: Gather current interface info 27712 1727096483.60446: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096483.60447: getting variables 27712 1727096483.60448: in VariableManager get_vars() 27712 1727096483.60456: Calling all_inventory to load vars for managed_node2 27712 1727096483.60457: Calling groups_inventory to load vars for managed_node2 27712 1727096483.60458: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.60462: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.60463: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.60464: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.60575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.60683: done with get_vars() 27712 1727096483.60690: done getting variables 27712 1727096483.60715: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 09:01:23 -0400 (0:00:00.019) 0:00:09.300 ****** 27712 1727096483.60733: entering _queue_task() for managed_node2/command 27712 1727096483.60918: worker is 1 (out of 1 available) 27712 1727096483.60930: exiting _queue_task() for managed_node2/command 27712 1727096483.60942: done queuing things up, now waiting for results queue to drain 27712 1727096483.60943: waiting for pending results... 
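Given the nesting shown here (manage_test_interface.yml includes show_interfaces.yml, which includes get_current_interfaces.yml and then re-runs 'Gather current interface info'), show_interfaces.yml is probably close to the sketch below. Only the task names, the debug message format and the file line numbers (3 and 5) come from this log; everything else is an assumption.

# show_interfaces.yml (reconstruction for illustration, not the original file)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"   # matches the MSG printed for this task earlier in the log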
27712 1727096483.61098: running TaskExecutor() for managed_node2/TASK: Gather current interface info 27712 1727096483.61168: in run() - task 0afff68d-5257-cbc7-8716-0000000003ab 27712 1727096483.61181: variable 'ansible_search_path' from source: unknown 27712 1727096483.61186: variable 'ansible_search_path' from source: unknown 27712 1727096483.61213: calling self._execute() 27712 1727096483.61281: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.61284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.61293: variable 'omit' from source: magic vars 27712 1727096483.61558: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.61569: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.61577: variable 'omit' from source: magic vars 27712 1727096483.61608: variable 'omit' from source: magic vars 27712 1727096483.61633: variable 'omit' from source: magic vars 27712 1727096483.61665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096483.61695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096483.61711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096483.61725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.61735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.61756: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096483.61761: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.61763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.61875: Set connection var ansible_connection to ssh 27712 1727096483.61879: Set connection var ansible_pipelining to False 27712 1727096483.61883: Set connection var ansible_timeout to 10 27712 1727096483.61886: Set connection var ansible_shell_type to sh 27712 1727096483.61888: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096483.61890: Set connection var ansible_shell_executable to /bin/sh 27712 1727096483.61892: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.61894: variable 'ansible_connection' from source: unknown 27712 1727096483.61897: variable 'ansible_module_compression' from source: unknown 27712 1727096483.61899: variable 'ansible_shell_type' from source: unknown 27712 1727096483.61901: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.61903: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.61905: variable 'ansible_pipelining' from source: unknown 27712 1727096483.61908: variable 'ansible_timeout' from source: unknown 27712 1727096483.61911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.61988: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096483.61997: variable 'omit' from source: magic vars 27712 
1727096483.62002: starting attempt loop 27712 1727096483.62005: running the handler 27712 1727096483.62017: _low_level_execute_command(): starting 27712 1727096483.62025: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096483.62547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.62551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.62554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096483.62556: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.62612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.62619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.62622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.62658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.64331: stdout chunk (state=3): >>>/root <<< 27712 1727096483.64425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.64456: stderr chunk (state=3): >>><<< 27712 1727096483.64459: stdout chunk (state=3): >>><<< 27712 1727096483.64484: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096483.64499: _low_level_execute_command(): starting 27712 1727096483.64506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987 `" && echo ansible-tmp-1727096483.644851-28283-141016790966987="` echo /root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987 `" ) && sleep 0' 27712 1727096483.64961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.64964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.64977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096483.64979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096483.64981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.65025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.65028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.65032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.65072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.67008: stdout chunk (state=3): >>>ansible-tmp-1727096483.644851-28283-141016790966987=/root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987 <<< 27712 1727096483.67113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.67143: stderr chunk (state=3): >>><<< 27712 1727096483.67146: stdout chunk (state=3): >>><<< 27712 1727096483.67162: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096483.644851-28283-141016790966987=/root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 27712 1727096483.67193: variable 'ansible_module_compression' from source: unknown 27712 1727096483.67239: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096483.67273: variable 'ansible_facts' from source: unknown 27712 1727096483.67328: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/AnsiballZ_command.py 27712 1727096483.67431: Sending initial data 27712 1727096483.67435: Sent initial data (155 bytes) 27712 1727096483.67892: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096483.67895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096483.67898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.67900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.67902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.67954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.67957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.67961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.67995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.69599: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096483.69630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096483.69658: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp4y45b9u8 /root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/AnsiballZ_command.py <<< 27712 1727096483.69670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/AnsiballZ_command.py" <<< 27712 1727096483.69692: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp4y45b9u8" to remote "/root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/AnsiballZ_command.py" <<< 27712 1727096483.69698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/AnsiballZ_command.py" <<< 27712 1727096483.70187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.70228: stderr chunk (state=3): >>><<< 27712 1727096483.70231: stdout chunk (state=3): >>><<< 27712 1727096483.70254: done transferring module to remote 27712 1727096483.70263: _low_level_execute_command(): starting 27712 1727096483.70272: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/ /root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/AnsiballZ_command.py && sleep 0' 27712 1727096483.70719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096483.70722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096483.70725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.70727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.70735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.70786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.70789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.70791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.70825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.72619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.72645: stderr chunk (state=3): >>><<< 27712 1727096483.72649: stdout chunk (state=3): >>><<< 27712 1727096483.72663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096483.72666: _low_level_execute_command(): starting 27712 1727096483.72680: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/AnsiballZ_command.py && sleep 0' 27712 1727096483.73123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.73127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.73129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.73131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.73187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.73190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.73234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.89351: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:01:23.888565", "end": "2024-09-23 09:01:23.891953", "delta": "0:00:00.003388", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096483.91260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096483.91265: stdout chunk (state=3): >>><<< 27712 1727096483.91289: stderr chunk (state=3): >>><<< 27712 1727096483.91366: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 09:01:23.888565", "end": "2024-09-23 09:01:23.891953", "delta": "0:00:00.003388", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
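The exchange above is one complete remote module run for the "Gather current interface info" task: Ansible creates a per-task temp directory under /root/.ansible/tmp, uploads AnsiballZ_command.py over SFTP, marks it executable, runs it with /usr/bin/python3.12, and reads back the JSON result whose stdout lists the entries of /sys/class/net; the same transfer/execute/cleanup pattern repeats for every module run in this log. Based on the logged module_args (chdir=/sys/class/net, _raw_params="ls -1"), a minimal sketch of the task behind this invocation follows; the register name and the changed_when line are inferences from the surrounding log, not text quoted from the real task file in get_current_interfaces.yml.

# Sketch only: reconstructed from the logged module_args, not the verbatim task file.
- name: Gather current interface info
  command:
    cmd: ls -1                    # logged as _raw_params: "ls -1"
    chdir: /sys/class/net         # logged as chdir: "/sys/class/net"
  register: _current_interfaces   # assumed name, inferred from the later set_fact step
  changed_when: false             # assumed: the task result reports changed=false although the raw module output says changed=true
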
27712 1727096483.91377: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096483.91403: _low_level_execute_command(): starting 27712 1727096483.91415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096483.644851-28283-141016790966987/ > /dev/null 2>&1 && sleep 0' 27712 1727096483.92043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096483.92061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096483.92083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096483.92122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096483.92160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096483.92178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096483.92256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096483.92291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096483.92315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096483.92385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096483.94297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096483.94304: stdout chunk (state=3): >>><<< 27712 1727096483.94310: stderr chunk (state=3): >>><<< 27712 1727096483.94375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096483.94378: handler run complete 27712 1727096483.94380: Evaluated conditional (False): False 27712 1727096483.94382: attempt loop complete, returning result 27712 1727096483.94388: _execute() done 27712 1727096483.94394: dumping result to json 27712 1727096483.94405: done dumping result, returning 27712 1727096483.94420: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0afff68d-5257-cbc7-8716-0000000003ab] 27712 1727096483.94430: sending task result for task 0afff68d-5257-cbc7-8716-0000000003ab ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003388", "end": "2024-09-23 09:01:23.891953", "rc": 0, "start": "2024-09-23 09:01:23.888565" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 27712 1727096483.94631: no more pending results, returning what we have 27712 1727096483.94635: results queue empty 27712 1727096483.94636: checking for any_errors_fatal 27712 1727096483.94637: done checking for any_errors_fatal 27712 1727096483.94638: checking for max_fail_percentage 27712 1727096483.94639: done checking for max_fail_percentage 27712 1727096483.94640: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.94641: done checking to see if all hosts have failed 27712 1727096483.94642: getting the remaining hosts for this loop 27712 1727096483.94643: done getting the remaining hosts for this loop 27712 1727096483.94648: getting the next task for host managed_node2 27712 1727096483.94656: done getting next task for host managed_node2 27712 1727096483.94660: ^ task is: TASK: Set current_interfaces 27712 1727096483.94668: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096483.94675: getting variables 27712 1727096483.94677: in VariableManager get_vars() 27712 1727096483.94720: Calling all_inventory to load vars for managed_node2 27712 1727096483.94724: Calling groups_inventory to load vars for managed_node2 27712 1727096483.94726: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.94738: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.94741: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.94743: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.95272: done sending task result for task 0afff68d-5257-cbc7-8716-0000000003ab 27712 1727096483.95276: WORKER PROCESS EXITING 27712 1727096483.95300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.95618: done with get_vars() 27712 1727096483.95649: done getting variables 27712 1727096483.95711: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 09:01:23 -0400 (0:00:00.350) 0:00:09.650 ****** 27712 1727096483.95757: entering _queue_task() for managed_node2/set_fact 27712 1727096483.96090: worker is 1 (out of 1 available) 27712 1727096483.96103: exiting _queue_task() for managed_node2/set_fact 27712 1727096483.96117: done queuing things up, now waiting for results queue to drain 27712 1727096483.96118: waiting for pending results... 
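The queue entry above hands the "Set current_interfaces" set_fact task (get_current_interfaces.yml:9) to the worker; the executor trace that follows reads the registered _current_interfaces result and publishes current_interfaces as the five interface names returned by ls -1. A minimal sketch of such a step, assuming the fact is built from the registered result's stdout_lines (the exact Jinja expression is not visible in the log):

# Sketch only: one way to produce the current_interfaces fact seen in the result below.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # assumed expression; the log shows only the resulting list
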
27712 1727096483.96355: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 27712 1727096483.96545: in run() - task 0afff68d-5257-cbc7-8716-0000000003ac 27712 1727096483.96588: variable 'ansible_search_path' from source: unknown 27712 1727096483.96779: variable 'ansible_search_path' from source: unknown 27712 1727096483.96820: calling self._execute() 27712 1727096483.96913: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.96927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.96940: variable 'omit' from source: magic vars 27712 1727096483.97300: variable 'ansible_distribution_major_version' from source: facts 27712 1727096483.97316: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096483.97326: variable 'omit' from source: magic vars 27712 1727096483.97389: variable 'omit' from source: magic vars 27712 1727096483.97507: variable '_current_interfaces' from source: set_fact 27712 1727096483.97579: variable 'omit' from source: magic vars 27712 1727096483.97628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096483.97669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096483.97696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096483.97725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.97829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096483.97833: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096483.97835: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.97837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.97910: Set connection var ansible_connection to ssh 27712 1727096483.97926: Set connection var ansible_pipelining to False 27712 1727096483.97946: Set connection var ansible_timeout to 10 27712 1727096483.97952: Set connection var ansible_shell_type to sh 27712 1727096483.97963: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096483.97976: Set connection var ansible_shell_executable to /bin/sh 27712 1727096483.98001: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.98008: variable 'ansible_connection' from source: unknown 27712 1727096483.98014: variable 'ansible_module_compression' from source: unknown 27712 1727096483.98020: variable 'ansible_shell_type' from source: unknown 27712 1727096483.98025: variable 'ansible_shell_executable' from source: unknown 27712 1727096483.98031: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096483.98042: variable 'ansible_pipelining' from source: unknown 27712 1727096483.98052: variable 'ansible_timeout' from source: unknown 27712 1727096483.98060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096483.98209: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096483.98264: variable 'omit' from source: magic vars 27712 1727096483.98268: starting attempt loop 27712 1727096483.98274: running the handler 27712 1727096483.98276: handler run complete 27712 1727096483.98278: attempt loop complete, returning result 27712 1727096483.98280: _execute() done 27712 1727096483.98294: dumping result to json 27712 1727096483.98301: done dumping result, returning 27712 1727096483.98312: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0afff68d-5257-cbc7-8716-0000000003ac] 27712 1727096483.98380: sending task result for task 0afff68d-5257-cbc7-8716-0000000003ac 27712 1727096483.98448: done sending task result for task 0afff68d-5257-cbc7-8716-0000000003ac 27712 1727096483.98452: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0" ] }, "changed": false } 27712 1727096483.98541: no more pending results, returning what we have 27712 1727096483.98544: results queue empty 27712 1727096483.98545: checking for any_errors_fatal 27712 1727096483.98555: done checking for any_errors_fatal 27712 1727096483.98556: checking for max_fail_percentage 27712 1727096483.98558: done checking for max_fail_percentage 27712 1727096483.98560: checking to see if all hosts have failed and the running result is not ok 27712 1727096483.98561: done checking to see if all hosts have failed 27712 1727096483.98561: getting the remaining hosts for this loop 27712 1727096483.98563: done getting the remaining hosts for this loop 27712 1727096483.98567: getting the next task for host managed_node2 27712 1727096483.98580: done getting next task for host managed_node2 27712 1727096483.98582: ^ task is: TASK: Show current_interfaces 27712 1727096483.98588: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096483.98777: getting variables 27712 1727096483.98778: in VariableManager get_vars() 27712 1727096483.98814: Calling all_inventory to load vars for managed_node2 27712 1727096483.98817: Calling groups_inventory to load vars for managed_node2 27712 1727096483.98819: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096483.98828: Calling all_plugins_play to load vars for managed_node2 27712 1727096483.98831: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096483.98834: Calling groups_plugins_play to load vars for managed_node2 27712 1727096483.99099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096483.99320: done with get_vars() 27712 1727096483.99332: done getting variables 27712 1727096483.99393: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 09:01:23 -0400 (0:00:00.036) 0:00:09.687 ****** 27712 1727096483.99430: entering _queue_task() for managed_node2/debug 27712 1727096483.99721: worker is 1 (out of 1 available) 27712 1727096483.99733: exiting _queue_task() for managed_node2/debug 27712 1727096483.99859: done queuing things up, now waiting for results queue to drain 27712 1727096483.99861: waiting for pending results... 27712 1727096484.00091: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 27712 1727096484.00187: in run() - task 0afff68d-5257-cbc7-8716-000000000375 27712 1727096484.00191: variable 'ansible_search_path' from source: unknown 27712 1727096484.00195: variable 'ansible_search_path' from source: unknown 27712 1727096484.00295: calling self._execute() 27712 1727096484.00335: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.00347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.00360: variable 'omit' from source: magic vars 27712 1727096484.00766: variable 'ansible_distribution_major_version' from source: facts 27712 1727096484.00790: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096484.00802: variable 'omit' from source: magic vars 27712 1727096484.00856: variable 'omit' from source: magic vars 27712 1727096484.00975: variable 'current_interfaces' from source: set_fact 27712 1727096484.01057: variable 'omit' from source: magic vars 27712 1727096484.01061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096484.01107: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096484.01131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096484.01154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096484.01180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096484.01217: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096484.01227: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.01235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.01348: Set connection var ansible_connection to ssh 27712 1727096484.01385: Set connection var ansible_pipelining to False 27712 1727096484.01388: Set connection var ansible_timeout to 10 27712 1727096484.01390: Set connection var ansible_shell_type to sh 27712 1727096484.01396: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096484.01411: Set connection var ansible_shell_executable to /bin/sh 27712 1727096484.01494: variable 'ansible_shell_executable' from source: unknown 27712 1727096484.01498: variable 'ansible_connection' from source: unknown 27712 1727096484.01500: variable 'ansible_module_compression' from source: unknown 27712 1727096484.01502: variable 'ansible_shell_type' from source: unknown 27712 1727096484.01505: variable 'ansible_shell_executable' from source: unknown 27712 1727096484.01508: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.01510: variable 'ansible_pipelining' from source: unknown 27712 1727096484.01512: variable 'ansible_timeout' from source: unknown 27712 1727096484.01514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.01643: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096484.01659: variable 'omit' from source: magic vars 27712 1727096484.01673: starting attempt loop 27712 1727096484.01681: running the handler 27712 1727096484.01740: handler run complete 27712 1727096484.01774: attempt loop complete, returning result 27712 1727096484.01778: _execute() done 27712 1727096484.01780: dumping result to json 27712 1727096484.01782: done dumping result, returning 27712 1727096484.01821: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0afff68d-5257-cbc7-8716-000000000375] 27712 1727096484.01824: sending task result for task 0afff68d-5257-cbc7-8716-000000000375 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0'] 27712 1727096484.02088: no more pending results, returning what we have 27712 1727096484.02092: results queue empty 27712 1727096484.02093: checking for any_errors_fatal 27712 1727096484.02099: done checking for any_errors_fatal 27712 1727096484.02100: checking for max_fail_percentage 27712 1727096484.02102: done checking for max_fail_percentage 27712 1727096484.02103: checking to see if all hosts have failed and the running result is not ok 27712 1727096484.02104: done checking to see if all hosts have failed 27712 1727096484.02105: getting the remaining hosts for this loop 27712 1727096484.02106: done getting the remaining hosts for this loop 27712 1727096484.02110: getting the next task for host managed_node2 27712 1727096484.02118: done getting next task for host managed_node2 27712 1727096484.02120: ^ task is: TASK: Install iproute 27712 1727096484.02124: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096484.02128: getting variables 27712 1727096484.02129: in VariableManager get_vars() 27712 1727096484.02348: Calling all_inventory to load vars for managed_node2 27712 1727096484.02350: Calling groups_inventory to load vars for managed_node2 27712 1727096484.02353: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096484.02359: done sending task result for task 0afff68d-5257-cbc7-8716-000000000375 27712 1727096484.02361: WORKER PROCESS EXITING 27712 1727096484.02374: Calling all_plugins_play to load vars for managed_node2 27712 1727096484.02377: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096484.02380: Calling groups_plugins_play to load vars for managed_node2 27712 1727096484.02624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096484.02842: done with get_vars() 27712 1727096484.02853: done getting variables 27712 1727096484.02915: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 09:01:24 -0400 (0:00:00.035) 0:00:09.722 ****** 27712 1727096484.02946: entering _queue_task() for managed_node2/package 27712 1727096484.03253: worker is 1 (out of 1 available) 27712 1727096484.03266: exiting _queue_task() for managed_node2/package 27712 1727096484.03487: done queuing things up, now waiting for results queue to drain 27712 1727096484.03488: waiting for pending results... 
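The "Install iproute" task queued here (manage_test_interface.yml:16) uses the generic package action, which on this node resolves to the dnf backend, as the cached AnsiballZ dnf module in the following entries shows. A minimal sketch of such a task; the package name and state are assumptions based on the task name, because the module arguments are not yet visible at this point in the log.

# Sketch only: package name and state are assumed, not taken from the log.
- name: Install iproute
  package:
    name: iproute    # assumed from the task name
    state: present   # assumed; not shown in the log at this point
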
27712 1727096484.03544: running TaskExecutor() for managed_node2/TASK: Install iproute 27712 1727096484.03658: in run() - task 0afff68d-5257-cbc7-8716-0000000002ff 27712 1727096484.03684: variable 'ansible_search_path' from source: unknown 27712 1727096484.03696: variable 'ansible_search_path' from source: unknown 27712 1727096484.03740: calling self._execute() 27712 1727096484.03835: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.03848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.03861: variable 'omit' from source: magic vars 27712 1727096484.04263: variable 'ansible_distribution_major_version' from source: facts 27712 1727096484.04286: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096484.04298: variable 'omit' from source: magic vars 27712 1727096484.04335: variable 'omit' from source: magic vars 27712 1727096484.04567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096484.06781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096484.06874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096484.06973: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096484.06976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096484.06991: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096484.07096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096484.07130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096484.07161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096484.07217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096484.07238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096484.07376: variable '__network_is_ostree' from source: set_fact 27712 1727096484.07379: variable 'omit' from source: magic vars 27712 1727096484.07404: variable 'omit' from source: magic vars 27712 1727096484.07432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096484.07462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096484.07512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096484.07515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 27712 1727096484.07520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096484.07548: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096484.07556: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.07562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.07675: Set connection var ansible_connection to ssh 27712 1727096484.07729: Set connection var ansible_pipelining to False 27712 1727096484.07731: Set connection var ansible_timeout to 10 27712 1727096484.07734: Set connection var ansible_shell_type to sh 27712 1727096484.07736: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096484.07738: Set connection var ansible_shell_executable to /bin/sh 27712 1727096484.07739: variable 'ansible_shell_executable' from source: unknown 27712 1727096484.07741: variable 'ansible_connection' from source: unknown 27712 1727096484.07744: variable 'ansible_module_compression' from source: unknown 27712 1727096484.07749: variable 'ansible_shell_type' from source: unknown 27712 1727096484.07755: variable 'ansible_shell_executable' from source: unknown 27712 1727096484.07760: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.07766: variable 'ansible_pipelining' from source: unknown 27712 1727096484.07777: variable 'ansible_timeout' from source: unknown 27712 1727096484.07783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.07880: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096484.07945: variable 'omit' from source: magic vars 27712 1727096484.07948: starting attempt loop 27712 1727096484.07950: running the handler 27712 1727096484.07952: variable 'ansible_facts' from source: unknown 27712 1727096484.07953: variable 'ansible_facts' from source: unknown 27712 1727096484.07955: _low_level_execute_command(): starting 27712 1727096484.07965: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096484.08694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096484.08776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.08824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096484.08844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.08882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.08953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.10737: stdout chunk (state=3): >>>/root <<< 27712 1727096484.10982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.10986: stdout chunk (state=3): >>><<< 27712 1727096484.10994: stderr chunk (state=3): >>><<< 27712 1727096484.11014: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096484.11027: _low_level_execute_command(): starting 27712 1727096484.11033: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566 `" && echo ansible-tmp-1727096484.1101549-28304-80447740710566="` echo /root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566 `" ) && sleep 0' 27712 1727096484.11696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096484.11700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096484.11719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096484.11724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.11773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.11819: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.11843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.11914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.13879: stdout chunk (state=3): >>>ansible-tmp-1727096484.1101549-28304-80447740710566=/root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566 <<< 27712 1727096484.14094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.14154: stderr chunk (state=3): >>><<< 27712 1727096484.14158: stdout chunk (state=3): >>><<< 27712 1727096484.14257: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096484.1101549-28304-80447740710566=/root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096484.14261: variable 'ansible_module_compression' from source: unknown 27712 1727096484.14408: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 27712 1727096484.14476: variable 'ansible_facts' from source: unknown 27712 1727096484.14628: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/AnsiballZ_dnf.py 27712 1727096484.14802: Sending initial data 27712 1727096484.14805: Sent initial data (151 bytes) 27712 1727096484.15553: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096484.15578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.15599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.15680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.17388: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096484.17393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096484.17617: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmputx5jip1 /root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/AnsiballZ_dnf.py <<< 27712 1727096484.17623: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/AnsiballZ_dnf.py" <<< 27712 1727096484.17627: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmputx5jip1" to remote "/root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/AnsiballZ_dnf.py" <<< 27712 1727096484.19200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.19373: stderr chunk (state=3): >>><<< 27712 1727096484.19376: stdout chunk (state=3): >>><<< 27712 1727096484.19548: done transferring module to remote 27712 1727096484.19551: _low_level_execute_command(): starting 27712 1727096484.19554: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/ /root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/AnsiballZ_dnf.py && sleep 0' 27712 1727096484.20671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096484.20682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.21317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.21350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.23191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.23243: stderr chunk (state=3): >>><<< 27712 1727096484.23246: stdout chunk (state=3): >>><<< 27712 1727096484.23270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096484.23273: _low_level_execute_command(): starting 27712 1727096484.23317: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/AnsiballZ_dnf.py && sleep 0' 27712 1727096484.24382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096484.24486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096484.24497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096484.24511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096484.24573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096484.24583: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.24818: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.24839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.24908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.66710: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 27712 1727096484.70748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096484.70777: stderr chunk (state=3): >>><<< 27712 1727096484.70781: stdout chunk (state=3): >>><<< 27712 1727096484.70798: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
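The JSON above is the complete response from the dnf module for the "Install iproute" task: rc=0, changed=false, msg "Nothing to do", with module_args name=["iproute"] and state="present". As a rough guide only, a task of the following shape would produce this invocation; it is a hedged sketch reconstructed from the logged arguments and the later "__install_status is success" evaluation, not the literal task in the test role, which may use the generic package module or additional retry options.

    - name: Install iproute
      ansible.builtin.dnf:          # logged as ansible.legacy.dnf with name=["iproute"], state="present"
        name: iproute
        state: present
      register: __install_status    # the log later evaluates "__install_status is success"
      until: __install_status is success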
27712 1727096484.70833: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096484.70839: _low_level_execute_command(): starting 27712 1727096484.70844: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096484.1101549-28304-80447740710566/ > /dev/null 2>&1 && sleep 0' 27712 1727096484.71303: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096484.71306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096484.71312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.71314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096484.71317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.71369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096484.71383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.71392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.71410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.73224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.73250: stderr chunk (state=3): >>><<< 27712 1727096484.73253: stdout chunk (state=3): >>><<< 27712 1727096484.73265: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096484.73277: handler run complete 27712 1727096484.73392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096484.73520: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096484.73566: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096484.73593: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096484.73615: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096484.73671: variable '__install_status' from source: set_fact 27712 1727096484.73687: Evaluated conditional (__install_status is success): True 27712 1727096484.73699: attempt loop complete, returning result 27712 1727096484.73702: _execute() done 27712 1727096484.73704: dumping result to json 27712 1727096484.73710: done dumping result, returning 27712 1727096484.73716: done running TaskExecutor() for managed_node2/TASK: Install iproute [0afff68d-5257-cbc7-8716-0000000002ff] 27712 1727096484.73720: sending task result for task 0afff68d-5257-cbc7-8716-0000000002ff 27712 1727096484.73826: done sending task result for task 0afff68d-5257-cbc7-8716-0000000002ff 27712 1727096484.73828: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 27712 1727096484.74006: no more pending results, returning what we have 27712 1727096484.74009: results queue empty 27712 1727096484.74010: checking for any_errors_fatal 27712 1727096484.74015: done checking for any_errors_fatal 27712 1727096484.74015: checking for max_fail_percentage 27712 1727096484.74017: done checking for max_fail_percentage 27712 1727096484.74017: checking to see if all hosts have failed and the running result is not ok 27712 1727096484.74018: done checking to see if all hosts have failed 27712 1727096484.74019: getting the remaining hosts for this loop 27712 1727096484.74020: done getting the remaining hosts for this loop 27712 1727096484.74023: getting the next task for host managed_node2 27712 1727096484.74028: done getting next task for host managed_node2 27712 1727096484.74030: ^ task is: TASK: Create veth interface {{ interface }} 27712 1727096484.74033: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 27712 1727096484.74036: getting variables 27712 1727096484.74037: in VariableManager get_vars() 27712 1727096484.74073: Calling all_inventory to load vars for managed_node2 27712 1727096484.74075: Calling groups_inventory to load vars for managed_node2 27712 1727096484.74077: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096484.74086: Calling all_plugins_play to load vars for managed_node2 27712 1727096484.74089: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096484.74091: Calling groups_plugins_play to load vars for managed_node2 27712 1727096484.74295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096484.74510: done with get_vars() 27712 1727096484.74521: done getting variables 27712 1727096484.74581: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096484.74702: variable 'interface' from source: set_fact TASK [Create veth interface ethtest1] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 09:01:24 -0400 (0:00:00.717) 0:00:10.440 ****** 27712 1727096484.74738: entering _queue_task() for managed_node2/command 27712 1727096484.75014: worker is 1 (out of 1 available) 27712 1727096484.75027: exiting _queue_task() for managed_node2/command 27712 1727096484.75085: done queuing things up, now waiting for results queue to drain 27712 1727096484.75087: waiting for pending results... 
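The task queued above, "Create veth interface ethtest1" (manage_test_interface.yml:27), is a looped command task: the log that follows loads the items lookup plugin, evaluates the conditional (type == 'veth' and state == 'present' and interface not in current_interfaces), and then runs the first loop item over SSH. A hedged sketch of such a task is shown below; only the first loop item is visible in this part of the log, so any further items are omitted rather than guessed.

    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      with_items:
        - "ip link add {{ interface }} type veth peer name peer{{ interface }}"
        # additional loop items (if any) are not visible in this excerpt
      when:
        - type == 'veth'
        - state == 'present'
        - interface not in current_interfaces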
27712 1727096484.75277: running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest1 27712 1727096484.75343: in run() - task 0afff68d-5257-cbc7-8716-000000000300 27712 1727096484.75354: variable 'ansible_search_path' from source: unknown 27712 1727096484.75359: variable 'ansible_search_path' from source: unknown 27712 1727096484.75557: variable 'interface' from source: set_fact 27712 1727096484.75620: variable 'interface' from source: set_fact 27712 1727096484.75671: variable 'interface' from source: set_fact 27712 1727096484.75785: Loaded config def from plugin (lookup/items) 27712 1727096484.75792: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 27712 1727096484.75810: variable 'omit' from source: magic vars 27712 1727096484.75898: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.75910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.75913: variable 'omit' from source: magic vars 27712 1727096484.76125: variable 'ansible_distribution_major_version' from source: facts 27712 1727096484.76134: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096484.76258: variable 'type' from source: set_fact 27712 1727096484.76261: variable 'state' from source: include params 27712 1727096484.76263: variable 'interface' from source: set_fact 27712 1727096484.76270: variable 'current_interfaces' from source: set_fact 27712 1727096484.76279: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27712 1727096484.76284: variable 'omit' from source: magic vars 27712 1727096484.76308: variable 'omit' from source: magic vars 27712 1727096484.76335: variable 'item' from source: unknown 27712 1727096484.76388: variable 'item' from source: unknown 27712 1727096484.76401: variable 'omit' from source: magic vars 27712 1727096484.76423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096484.76445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096484.76460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096484.76478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096484.76486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096484.76510: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096484.76513: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.76516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.76587: Set connection var ansible_connection to ssh 27712 1727096484.76593: Set connection var ansible_pipelining to False 27712 1727096484.76598: Set connection var ansible_timeout to 10 27712 1727096484.76601: Set connection var ansible_shell_type to sh 27712 1727096484.76607: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096484.76614: Set connection var ansible_shell_executable to /bin/sh 27712 1727096484.76628: variable 'ansible_shell_executable' from source: unknown 27712 1727096484.76631: variable 'ansible_connection' from source: unknown 27712 1727096484.76634: variable 
'ansible_module_compression' from source: unknown 27712 1727096484.76636: variable 'ansible_shell_type' from source: unknown 27712 1727096484.76638: variable 'ansible_shell_executable' from source: unknown 27712 1727096484.76640: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096484.76644: variable 'ansible_pipelining' from source: unknown 27712 1727096484.76647: variable 'ansible_timeout' from source: unknown 27712 1727096484.76652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096484.76750: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096484.76758: variable 'omit' from source: magic vars 27712 1727096484.76763: starting attempt loop 27712 1727096484.76765: running the handler 27712 1727096484.76781: _low_level_execute_command(): starting 27712 1727096484.76790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096484.77580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.77654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.77658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.77710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.79381: stdout chunk (state=3): >>>/root <<< 27712 1727096484.79512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.79543: stderr chunk (state=3): >>><<< 27712 1727096484.79547: stdout chunk (state=3): >>><<< 27712 1727096484.79566: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096484.79662: _low_level_execute_command(): starting 27712 1727096484.79666: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204 `" && echo ansible-tmp-1727096484.7957575-28345-254610988315204="` echo /root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204 `" ) && sleep 0' 27712 1727096484.80186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096484.80199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096484.80213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096484.80228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096484.80243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096484.80254: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096484.80266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.80360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096484.80386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.80401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.80493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.82361: stdout chunk (state=3): >>>ansible-tmp-1727096484.7957575-28345-254610988315204=/root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204 <<< 27712 1727096484.82505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.82518: stdout chunk (state=3): >>><<< 27712 1727096484.82535: stderr chunk (state=3): >>><<< 27712 1727096484.82556: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096484.7957575-28345-254610988315204=/root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096484.82597: variable 'ansible_module_compression' from source: unknown 27712 1727096484.82656: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096484.82700: variable 'ansible_facts' from source: unknown 27712 1727096484.82874: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/AnsiballZ_command.py 27712 1727096484.82993: Sending initial data 27712 1727096484.83005: Sent initial data (156 bytes) 27712 1727096484.83589: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096484.83673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.83714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096484.83733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.83758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.83828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096484.85411: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: 
Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096484.85429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096484.85478: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpd1_r51if /root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/AnsiballZ_command.py <<< 27712 1727096484.85481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/AnsiballZ_command.py" <<< 27712 1727096484.85529: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpd1_r51if" to remote "/root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/AnsiballZ_command.py" <<< 27712 1727096484.86215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.86330: stderr chunk (state=3): >>><<< 27712 1727096484.86334: stdout chunk (state=3): >>><<< 27712 1727096484.86344: done transferring module to remote 27712 1727096484.86356: _low_level_execute_command(): starting 27712 1727096484.86428: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/ /root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/AnsiballZ_command.py && sleep 0' 27712 1727096484.86914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096484.86931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096484.86948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096484.86972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096484.86991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096484.87004: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096484.87019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.87039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096484.87083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.87136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096484.87154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.87179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.87251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 
1727096484.89035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096484.89055: stdout chunk (state=3): >>><<< 27712 1727096484.89070: stderr chunk (state=3): >>><<< 27712 1727096484.89093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096484.89102: _low_level_execute_command(): starting 27712 1727096484.89112: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/AnsiballZ_command.py && sleep 0' 27712 1727096484.89756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096484.89773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096484.89790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096484.89809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096484.89842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.89887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096484.89957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096484.89991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096484.90022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096484.90075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.06589: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", 
"veth", "peer", "name", "peerethtest1"], "start": "2024-09-23 09:01:25.057022", "end": "2024-09-23 09:01:25.064521", "delta": "0:00:00.007499", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096485.09065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096485.09093: stderr chunk (state=3): >>><<< 27712 1727096485.09097: stdout chunk (state=3): >>><<< 27712 1727096485.09113: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-23 09:01:25.057022", "end": "2024-09-23 09:01:25.064521", "delta": "0:00:00.007499", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096485.09146: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest1 type veth peer name peerethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096485.09154: _low_level_execute_command(): starting 27712 1727096485.09159: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096484.7957575-28345-254610988315204/ > /dev/null 2>&1 && sleep 0' 27712 1727096485.09610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.09613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.09616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096485.09619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.09621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.09672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.09676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096485.09684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.09723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.13530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.13550: stderr chunk (state=3): >>><<< 27712 1727096485.13553: stdout chunk (state=3): >>><<< 27712 1727096485.13579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.13582: handler run complete 27712 1727096485.13605: Evaluated conditional (False): False 27712 1727096485.13612: attempt loop complete, returning result 27712 1727096485.13627: variable 'item' from source: unknown 27712 1727096485.13690: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add ethtest1 type veth peer name peerethtest1) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1" ], "delta": "0:00:00.007499", "end": "2024-09-23 09:01:25.064521", "item": "ip link add ethtest1 type veth peer name peerethtest1", "rc": 0, "start": "2024-09-23 09:01:25.057022" } 27712 1727096485.13883: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.13886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.13888: variable 'omit' from source: magic vars 27712 1727096485.13949: variable 'ansible_distribution_major_version' from source: facts 27712 1727096485.13952: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096485.14077: variable 'type' from source: set_fact 27712 1727096485.14080: variable 'state' from source: include params 27712 1727096485.14085: variable 'interface' from source: set_fact 27712 1727096485.14096: variable 'current_interfaces' from source: set_fact 27712 1727096485.14099: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27712 1727096485.14101: variable 'omit' from source: magic vars 27712 1727096485.14111: variable 'omit' from source: magic vars 27712 1727096485.14136: variable 'item' from source: unknown 27712 1727096485.14182: variable 'item' from source: unknown 27712 1727096485.14200: variable 'omit' from source: magic vars 27712 1727096485.14214: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096485.14220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096485.14227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096485.14237: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096485.14240: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.14242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.14295: Set connection var ansible_connection to ssh 27712 1727096485.14300: Set connection var ansible_pipelining to False 27712 1727096485.14308: Set connection var ansible_timeout to 10 27712 1727096485.14311: Set connection var ansible_shell_type to sh 27712 
1727096485.14315: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096485.14320: Set connection var ansible_shell_executable to /bin/sh 27712 1727096485.14335: variable 'ansible_shell_executable' from source: unknown 27712 1727096485.14338: variable 'ansible_connection' from source: unknown 27712 1727096485.14341: variable 'ansible_module_compression' from source: unknown 27712 1727096485.14343: variable 'ansible_shell_type' from source: unknown 27712 1727096485.14345: variable 'ansible_shell_executable' from source: unknown 27712 1727096485.14347: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.14351: variable 'ansible_pipelining' from source: unknown 27712 1727096485.14354: variable 'ansible_timeout' from source: unknown 27712 1727096485.14358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.14424: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096485.14436: variable 'omit' from source: magic vars 27712 1727096485.14439: starting attempt loop 27712 1727096485.14441: running the handler 27712 1727096485.14445: _low_level_execute_command(): starting 27712 1727096485.14449: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096485.14851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.14859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096485.14879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096485.14882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.14892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.14901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.14955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.14958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096485.14962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.15002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.16609: stdout chunk (state=3): >>>/root <<< 27712 1727096485.16705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.16734: stderr chunk (state=3): >>><<< 27712 1727096485.16737: stdout chunk (state=3): >>><<< 27712 1727096485.16749: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.16757: _low_level_execute_command(): starting 27712 1727096485.16762: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569 `" && echo ansible-tmp-1727096485.1674902-28345-207880075408569="` echo /root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569 `" ) && sleep 0' 27712 1727096485.17168: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.17203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096485.17206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096485.17208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.17210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.17214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096485.17216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.17263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.17272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.17300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.19169: stdout chunk (state=3): >>>ansible-tmp-1727096485.1674902-28345-207880075408569=/root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569 <<< 27712 1727096485.19283: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 27712 1727096485.19304: stderr chunk (state=3): >>><<< 27712 1727096485.19307: stdout chunk (state=3): >>><<< 27712 1727096485.19320: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096485.1674902-28345-207880075408569=/root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.19338: variable 'ansible_module_compression' from source: unknown 27712 1727096485.19370: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096485.19387: variable 'ansible_facts' from source: unknown 27712 1727096485.19432: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/AnsiballZ_command.py 27712 1727096485.19525: Sending initial data 27712 1727096485.19529: Sent initial data (156 bytes) 27712 1727096485.19946: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.19953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096485.19977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.19992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096485.19998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.20045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.20048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096485.20052: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 27712 1727096485.20085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.21622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27712 1727096485.21629: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096485.21653: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096485.21689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpl_bv0u92 /root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/AnsiballZ_command.py <<< 27712 1727096485.21701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/AnsiballZ_command.py" <<< 27712 1727096485.21719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpl_bv0u92" to remote "/root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/AnsiballZ_command.py" <<< 27712 1727096485.22206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.22244: stderr chunk (state=3): >>><<< 27712 1727096485.22249: stdout chunk (state=3): >>><<< 27712 1727096485.22274: done transferring module to remote 27712 1727096485.22281: _low_level_execute_command(): starting 27712 1727096485.22286: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/ /root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/AnsiballZ_command.py && sleep 0' 27712 1727096485.22714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.22717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096485.22720: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.22722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.22724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096485.22725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.22771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.22775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.22813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.24547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.24573: stderr chunk (state=3): >>><<< 27712 1727096485.24576: stdout chunk (state=3): >>><<< 27712 1727096485.24591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.24594: _low_level_execute_command(): starting 27712 1727096485.24603: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/AnsiballZ_command.py && sleep 0' 27712 1727096485.25014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.25017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096485.25019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096485.25021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096485.25023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.25073: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.25081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.25118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.40665: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-23 09:01:25.401703", "end": "2024-09-23 09:01:25.405556", "delta": "0:00:00.003853", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 27712 1727096485.40677: stdout chunk (state=3): >>> <<< 27712 1727096485.42165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096485.42202: stderr chunk (state=3): >>><<< 27712 1727096485.42205: stdout chunk (state=3): >>><<< 27712 1727096485.42220: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-23 09:01:25.401703", "end": "2024-09-23 09:01:25.405556", "delta": "0:00:00.003853", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
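The result blob above is the raw JSON that AnsiballZ_command.py printed for the first loop item, ip link set peerethtest1 up: rc=0, empty stdout/stderr, and the parsed argv under "cmd". For orientation only, a task shaped like the one driving this loop might look roughly as follows; this is a hedged reconstruction from the loop items and conditionals visible in this log, not the verbatim contents of manage_test_interface.yml:

    # Illustrative sketch reconstructed from this log, not the actual test file.
    # 'type', 'state', 'interface' and 'current_interfaces' are the variables the
    # log shows being evaluated before each item runs.
    - name: Create veth interface ethtest1
      ansible.builtin.command: "{{ item }}"
      loop:
        - ip link set peerethtest1 up
        - ip link set ethtest1 up
      when:
        - ansible_distribution_major_version != '6'
        - type == 'veth'
        - state == 'present'
        - interface not in current_interfaces

Each loop item goes through the same remote lifecycle traced above: resolve the remote home, create a private ansible-tmp directory, sftp the AnsiballZ payload, chmod it, run it with /usr/bin/python3.12, then remove the directory.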
27712 1727096485.42246: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096485.42252: _low_level_execute_command(): starting 27712 1727096485.42257: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096485.1674902-28345-207880075408569/ > /dev/null 2>&1 && sleep 0' 27712 1727096485.42712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.42716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096485.42718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.42720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.42722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.42773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.42791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096485.42795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.42822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.44617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.44641: stderr chunk (state=3): >>><<< 27712 1727096485.44645: stdout chunk (state=3): >>><<< 27712 1727096485.44661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.44666: handler run complete 27712 1727096485.44686: Evaluated conditional (False): False 27712 1727096485.44694: attempt loop complete, returning result 27712 1727096485.44710: variable 'item' from source: unknown 27712 1727096485.44769: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest1", "up" ], "delta": "0:00:00.003853", "end": "2024-09-23 09:01:25.405556", "item": "ip link set peerethtest1 up", "rc": 0, "start": "2024-09-23 09:01:25.401703" } 27712 1727096485.44882: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.44885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.44887: variable 'omit' from source: magic vars 27712 1727096485.44986: variable 'ansible_distribution_major_version' from source: facts 27712 1727096485.44989: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096485.45113: variable 'type' from source: set_fact 27712 1727096485.45117: variable 'state' from source: include params 27712 1727096485.45119: variable 'interface' from source: set_fact 27712 1727096485.45121: variable 'current_interfaces' from source: set_fact 27712 1727096485.45127: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 27712 1727096485.45131: variable 'omit' from source: magic vars 27712 1727096485.45142: variable 'omit' from source: magic vars 27712 1727096485.45166: variable 'item' from source: unknown 27712 1727096485.45213: variable 'item' from source: unknown 27712 1727096485.45226: variable 'omit' from source: magic vars 27712 1727096485.45245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096485.45251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096485.45257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096485.45266: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096485.45271: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.45277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.45322: Set connection var ansible_connection to ssh 27712 1727096485.45327: Set connection var ansible_pipelining to False 27712 1727096485.45334: Set connection var ansible_timeout to 10 27712 1727096485.45336: Set connection var ansible_shell_type to sh 27712 1727096485.45344: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096485.45346: Set connection var 
ansible_shell_executable to /bin/sh 27712 1727096485.45360: variable 'ansible_shell_executable' from source: unknown 27712 1727096485.45362: variable 'ansible_connection' from source: unknown 27712 1727096485.45365: variable 'ansible_module_compression' from source: unknown 27712 1727096485.45368: variable 'ansible_shell_type' from source: unknown 27712 1727096485.45371: variable 'ansible_shell_executable' from source: unknown 27712 1727096485.45376: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.45380: variable 'ansible_pipelining' from source: unknown 27712 1727096485.45382: variable 'ansible_timeout' from source: unknown 27712 1727096485.45387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.45450: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096485.45457: variable 'omit' from source: magic vars 27712 1727096485.45459: starting attempt loop 27712 1727096485.45462: running the handler 27712 1727096485.45470: _low_level_execute_command(): starting 27712 1727096485.45476: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096485.45914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.45918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.45920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096485.45922: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.45924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.45967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.45976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.46017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.47611: stdout chunk (state=3): >>>/root <<< 27712 1727096485.47715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.47740: stderr chunk (state=3): >>><<< 27712 1727096485.47744: stdout chunk (state=3): >>><<< 27712 1727096485.47757: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.47764: _low_level_execute_command(): starting 27712 1727096485.47773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694 `" && echo ansible-tmp-1727096485.4775674-28345-251164204266694="` echo /root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694 `" ) && sleep 0' 27712 1727096485.48173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.48177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096485.48210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096485.48213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.48215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.48217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096485.48219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.48265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.48271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.48313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.50165: stdout chunk (state=3): >>>ansible-tmp-1727096485.4775674-28345-251164204266694=/root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694 <<< 27712 1727096485.50278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.50298: stderr chunk (state=3): >>><<< 27712 1727096485.50301: stdout chunk (state=3): >>><<< 27712 1727096485.50313: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1727096485.4775674-28345-251164204266694=/root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.50332: variable 'ansible_module_compression' from source: unknown 27712 1727096485.50358: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096485.50376: variable 'ansible_facts' from source: unknown 27712 1727096485.50420: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/AnsiballZ_command.py 27712 1727096485.50505: Sending initial data 27712 1727096485.50509: Sent initial data (156 bytes) 27712 1727096485.50928: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.50931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.50947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.51000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.51003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.51039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.52555: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27712 1727096485.52559: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096485.52590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096485.52618: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmplbbve03d /root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/AnsiballZ_command.py <<< 27712 1727096485.52630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/AnsiballZ_command.py" <<< 27712 1727096485.52648: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmplbbve03d" to remote "/root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/AnsiballZ_command.py" <<< 27712 1727096485.52655: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/AnsiballZ_command.py" <<< 27712 1727096485.53137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.53178: stderr chunk (state=3): >>><<< 27712 1727096485.53182: stdout chunk (state=3): >>><<< 27712 1727096485.53205: done transferring module to remote 27712 1727096485.53211: _low_level_execute_command(): starting 27712 1727096485.53216: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/ /root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/AnsiballZ_command.py && sleep 0' 27712 1727096485.53644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.53647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096485.53650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.53652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.53654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.53706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.53712: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096485.53714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.53743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.55503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.55525: stderr chunk (state=3): >>><<< 27712 1727096485.55528: stdout chunk (state=3): >>><<< 27712 1727096485.55541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.55548: _low_level_execute_command(): starting 27712 1727096485.55551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/AnsiballZ_command.py && sleep 0' 27712 1727096485.55956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.55988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096485.55991: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096485.55993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.55996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.56045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.56048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.56098: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 27712 1727096485.71793: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-23 09:01:25.712816", "end": "2024-09-23 09:01:25.716417", "delta": "0:00:00.003601", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096485.73322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096485.73348: stderr chunk (state=3): >>><<< 27712 1727096485.73352: stdout chunk (state=3): >>><<< 27712 1727096485.73376: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-23 09:01:25.712816", "end": "2024-09-23 09:01:25.716417", "delta": "0:00:00.003601", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
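The second item, ip link set ethtest1 up, returns the same result shape: rc, stdout, stderr, "cmd" as an argv list, and start/end/delta timing. When a looped command task like this one is registered, each of those per-item dictionaries lands in the registered variable's results list. A minimal, hypothetical sketch of capturing and inspecting them (the variable name link_up and the debug task are assumptions, not part of the test):

    # Hypothetical follow-up, not taken from manage_test_interface.yml.
    - name: Bring the veth ends up and keep the per-item results
      ansible.builtin.command: "{{ item }}"
      loop:
        - ip link set peerethtest1 up
        - ip link set ethtest1 up
      register: link_up

    - name: Report rc and timing for each item
      ansible.builtin.debug:
        msg: "{{ item.cmd | join(' ') }} -> rc={{ item.rc }}, delta={{ item.delta }}"
      loop: "{{ link_up.results }}"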
27712 1727096485.73398: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096485.73403: _low_level_execute_command(): starting 27712 1727096485.73408: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096485.4775674-28345-251164204266694/ > /dev/null 2>&1 && sleep 0' 27712 1727096485.73836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.73840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096485.73871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096485.73874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096485.73877: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.73879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.73940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.73943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096485.73947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.73983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.75794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.75821: stderr chunk (state=3): >>><<< 27712 1727096485.75826: stdout chunk (state=3): >>><<< 27712 1727096485.75843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.75851: handler run complete 27712 1727096485.75863: Evaluated conditional (False): False 27712 1727096485.75872: attempt loop complete, returning result 27712 1727096485.75890: variable 'item' from source: unknown 27712 1727096485.75953: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set ethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest1", "up" ], "delta": "0:00:00.003601", "end": "2024-09-23 09:01:25.716417", "item": "ip link set ethtest1 up", "rc": 0, "start": "2024-09-23 09:01:25.712816" } 27712 1727096485.76066: dumping result to json 27712 1727096485.76071: done dumping result, returning 27712 1727096485.76074: done running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest1 [0afff68d-5257-cbc7-8716-000000000300] 27712 1727096485.76075: sending task result for task 0afff68d-5257-cbc7-8716-000000000300 27712 1727096485.76247: no more pending results, returning what we have 27712 1727096485.76250: results queue empty 27712 1727096485.76251: checking for any_errors_fatal 27712 1727096485.76254: done checking for any_errors_fatal 27712 1727096485.76254: checking for max_fail_percentage 27712 1727096485.76256: done checking for max_fail_percentage 27712 1727096485.76256: checking to see if all hosts have failed and the running result is not ok 27712 1727096485.76257: done checking to see if all hosts have failed 27712 1727096485.76258: getting the remaining hosts for this loop 27712 1727096485.76259: done getting the remaining hosts for this loop 27712 1727096485.76261: getting the next task for host managed_node2 27712 1727096485.76266: done getting next task for host managed_node2 27712 1727096485.76270: ^ task is: TASK: Set up veth as managed by NetworkManager 27712 1727096485.76273: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096485.76276: getting variables 27712 1727096485.76285: in VariableManager get_vars() 27712 1727096485.76313: Calling all_inventory to load vars for managed_node2 27712 1727096485.76316: Calling groups_inventory to load vars for managed_node2 27712 1727096485.76318: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096485.76323: done sending task result for task 0afff68d-5257-cbc7-8716-000000000300 27712 1727096485.76325: WORKER PROCESS EXITING 27712 1727096485.76333: Calling all_plugins_play to load vars for managed_node2 27712 1727096485.76336: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096485.76338: Calling groups_plugins_play to load vars for managed_node2 27712 1727096485.76451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096485.76574: done with get_vars() 27712 1727096485.76582: done getting variables 27712 1727096485.76624: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 09:01:25 -0400 (0:00:01.019) 0:00:11.459 ****** 27712 1727096485.76643: entering _queue_task() for managed_node2/command 27712 1727096485.76842: worker is 1 (out of 1 available) 27712 1727096485.76857: exiting _queue_task() for managed_node2/command 27712 1727096485.76871: done queuing things up, now waiting for results queue to drain 27712 1727096485.76873: waiting for pending results... 
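With the veth task finished, the next task from manage_test_interface.yml:35, "Set up veth as managed by NetworkManager", is queued. Before each handler run the log prints the same "Set connection var" block (ssh connection, pipelining False, timeout 10, sh shell via /bin/sh, ZIP_DEFLATED module compression); most of these resolve "from source: unknown", i.e. defaults. If they ever needed to be pinned per host, a hypothetical host_vars file could set the corresponding variables explicitly; the values below simply restate what this run used:

    # host_vars/managed_node2.yml -- hypothetical; restates the defaults this run used.
    ansible_connection: ssh
    ansible_pipelining: false
    ansible_timeout: 10
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_module_compression: ZIP_DEFLATED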
27712 1727096485.77027: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 27712 1727096485.77088: in run() - task 0afff68d-5257-cbc7-8716-000000000301 27712 1727096485.77104: variable 'ansible_search_path' from source: unknown 27712 1727096485.77108: variable 'ansible_search_path' from source: unknown 27712 1727096485.77131: calling self._execute() 27712 1727096485.77199: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.77205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.77216: variable 'omit' from source: magic vars 27712 1727096485.77468: variable 'ansible_distribution_major_version' from source: facts 27712 1727096485.77481: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096485.77587: variable 'type' from source: set_fact 27712 1727096485.77591: variable 'state' from source: include params 27712 1727096485.77596: Evaluated conditional (type == 'veth' and state == 'present'): True 27712 1727096485.77602: variable 'omit' from source: magic vars 27712 1727096485.77626: variable 'omit' from source: magic vars 27712 1727096485.77697: variable 'interface' from source: set_fact 27712 1727096485.77710: variable 'omit' from source: magic vars 27712 1727096485.77739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096485.77769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096485.77790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096485.77803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096485.77813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096485.77834: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096485.77837: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.77840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.77914: Set connection var ansible_connection to ssh 27712 1727096485.77920: Set connection var ansible_pipelining to False 27712 1727096485.77926: Set connection var ansible_timeout to 10 27712 1727096485.77928: Set connection var ansible_shell_type to sh 27712 1727096485.77935: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096485.77939: Set connection var ansible_shell_executable to /bin/sh 27712 1727096485.77956: variable 'ansible_shell_executable' from source: unknown 27712 1727096485.77958: variable 'ansible_connection' from source: unknown 27712 1727096485.77961: variable 'ansible_module_compression' from source: unknown 27712 1727096485.77963: variable 'ansible_shell_type' from source: unknown 27712 1727096485.77965: variable 'ansible_shell_executable' from source: unknown 27712 1727096485.77979: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096485.77983: variable 'ansible_pipelining' from source: unknown 27712 1727096485.77985: variable 'ansible_timeout' from source: unknown 27712 1727096485.77987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096485.78077: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096485.78086: variable 'omit' from source: magic vars 27712 1727096485.78095: starting attempt loop 27712 1727096485.78098: running the handler 27712 1727096485.78111: _low_level_execute_command(): starting 27712 1727096485.78118: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096485.78618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.78621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.78626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.78629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.78673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.78691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096485.78695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.78728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.80357: stdout chunk (state=3): >>>/root <<< 27712 1727096485.80449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.80481: stderr chunk (state=3): >>><<< 27712 1727096485.80485: stdout chunk (state=3): >>><<< 27712 1727096485.80503: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.80513: _low_level_execute_command(): starting 27712 1727096485.80519: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278 `" && echo ansible-tmp-1727096485.8050196-28386-52030129535278="` echo /root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278 `" ) && sleep 0' 27712 1727096485.80937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.80946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.80949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096485.80951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096485.80953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.80996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.80999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.81038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.82917: stdout chunk (state=3): >>>ansible-tmp-1727096485.8050196-28386-52030129535278=/root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278 <<< 27712 1727096485.83023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.83049: stderr chunk (state=3): >>><<< 27712 1727096485.83054: stdout chunk (state=3): >>><<< 27712 1727096485.83072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096485.8050196-28386-52030129535278=/root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.83097: variable 'ansible_module_compression' from source: unknown 27712 1727096485.83135: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096485.83166: variable 'ansible_facts' from source: unknown 27712 1727096485.83224: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/AnsiballZ_command.py 27712 1727096485.83322: Sending initial data 27712 1727096485.83325: Sent initial data (155 bytes) 27712 1727096485.83743: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.83747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.83757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.83813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.83821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.83853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.85390: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 27712 1727096485.85403: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096485.85419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096485.85452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpdgbrfj8t /root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/AnsiballZ_command.py <<< 27712 1727096485.85466: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/AnsiballZ_command.py" <<< 27712 1727096485.85488: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 27712 1727096485.85491: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpdgbrfj8t" to remote "/root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/AnsiballZ_command.py" <<< 27712 1727096485.85965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.86006: stderr chunk (state=3): >>><<< 27712 1727096485.86009: stdout chunk (state=3): >>><<< 27712 1727096485.86046: done transferring module to remote 27712 1727096485.86055: _low_level_execute_command(): starting 27712 1727096485.86058: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/ /root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/AnsiballZ_command.py && sleep 0' 27712 1727096485.86487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.86490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096485.86493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.86498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096485.86501: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.86545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.86548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.86587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096485.88338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096485.88363: stderr chunk (state=3): >>><<< 27712 1727096485.88366: stdout chunk (state=3): >>><<< 27712 1727096485.88384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096485.88387: _low_level_execute_command(): starting 27712 1727096485.88391: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/AnsiballZ_command.py && sleep 0' 27712 1727096485.88814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096485.88817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.88819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096485.88821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096485.88823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096485.88874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096485.88881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096485.88919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.06012: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-23 09:01:26.039491", "end": "2024-09-23 09:01:26.058127", "delta": "0:00:00.018636", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096486.07538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096486.07573: stderr chunk (state=3): >>><<< 27712 1727096486.07577: stdout chunk (state=3): >>><<< 27712 1727096486.07590: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-23 09:01:26.039491", "end": "2024-09-23 09:01:26.058127", "delta": "0:00:00.018636", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
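The JSON blob above is the raw result of ansible.legacy.command running nmcli against the veth peer; a few entries further down, the task (named "Set up veth as managed by NetworkManager") is reported back as ok with "changed": false, which is what a changed_when override would produce. A minimal sketch of what that task could look like, with the argv taken from the module result and the changed_when line an assumption rather than a copy of the real task file:

- name: Set up veth as managed by NetworkManager
  ansible.builtin.command: nmcli d set {{ interface }} managed true  # argv matches the module result above
  changed_when: false  # assumption: would explain the final "changed": false despite the module reporting changed: true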
27712 1727096486.07619: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest1 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096486.07626: _low_level_execute_command(): starting 27712 1727096486.07631: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096485.8050196-28386-52030129535278/ > /dev/null 2>&1 && sleep 0' 27712 1727096486.08092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.08102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096486.08104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.08106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.08108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096486.08110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.08154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.08157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096486.08160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.08196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.10007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.10032: stderr chunk (state=3): >>><<< 27712 1727096486.10035: stdout chunk (state=3): >>><<< 27712 1727096486.10053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096486.10056: handler run complete 27712 1727096486.10077: Evaluated conditional (False): False 27712 1727096486.10085: attempt loop complete, returning result 27712 1727096486.10088: _execute() done 27712 1727096486.10090: dumping result to json 27712 1727096486.10096: done dumping result, returning 27712 1727096486.10103: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-cbc7-8716-000000000301] 27712 1727096486.10108: sending task result for task 0afff68d-5257-cbc7-8716-000000000301 27712 1727096486.10203: done sending task result for task 0afff68d-5257-cbc7-8716-000000000301 27712 1727096486.10206: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest1", "managed", "true" ], "delta": "0:00:00.018636", "end": "2024-09-23 09:01:26.058127", "rc": 0, "start": "2024-09-23 09:01:26.039491" } 27712 1727096486.10271: no more pending results, returning what we have 27712 1727096486.10274: results queue empty 27712 1727096486.10275: checking for any_errors_fatal 27712 1727096486.10292: done checking for any_errors_fatal 27712 1727096486.10293: checking for max_fail_percentage 27712 1727096486.10295: done checking for max_fail_percentage 27712 1727096486.10295: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.10296: done checking to see if all hosts have failed 27712 1727096486.10297: getting the remaining hosts for this loop 27712 1727096486.10298: done getting the remaining hosts for this loop 27712 1727096486.10301: getting the next task for host managed_node2 27712 1727096486.10308: done getting next task for host managed_node2 27712 1727096486.10311: ^ task is: TASK: Delete veth interface {{ interface }} 27712 1727096486.10314: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.10320: getting variables 27712 1727096486.10321: in VariableManager get_vars() 27712 1727096486.10359: Calling all_inventory to load vars for managed_node2 27712 1727096486.10361: Calling groups_inventory to load vars for managed_node2 27712 1727096486.10364: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.10380: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.10383: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.10386: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.10521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.10664: done with get_vars() 27712 1727096486.10674: done getting variables 27712 1727096486.10718: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096486.10804: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest1] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 09:01:26 -0400 (0:00:00.341) 0:00:11.801 ****** 27712 1727096486.10827: entering _queue_task() for managed_node2/command 27712 1727096486.11020: worker is 1 (out of 1 available) 27712 1727096486.11035: exiting _queue_task() for managed_node2/command 27712 1727096486.11047: done queuing things up, now waiting for results queue to drain 27712 1727096486.11049: waiting for pending results... 
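The task queued next, "Delete veth interface ethtest1" (manage_test_interface.yml:43), is gated by a three-part condition that the executor evaluates to False just below and therefore skips. A plausible sketch of such a task, with the when: expression copied from the false_condition in the skip result and the ip command itself an assumption (it is not shown in this excerpt):

- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link del {{ interface }}  # assumed command, not visible in this log
  when: type == 'veth' and state == 'absent' and interface in current_interfaces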
27712 1727096486.11205: running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest1 27712 1727096486.11271: in run() - task 0afff68d-5257-cbc7-8716-000000000302 27712 1727096486.11286: variable 'ansible_search_path' from source: unknown 27712 1727096486.11290: variable 'ansible_search_path' from source: unknown 27712 1727096486.11317: calling self._execute() 27712 1727096486.11379: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.11385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.11396: variable 'omit' from source: magic vars 27712 1727096486.11646: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.11654: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.11788: variable 'type' from source: set_fact 27712 1727096486.11792: variable 'state' from source: include params 27712 1727096486.11794: variable 'interface' from source: set_fact 27712 1727096486.11799: variable 'current_interfaces' from source: set_fact 27712 1727096486.11807: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 27712 1727096486.11809: when evaluation is False, skipping this task 27712 1727096486.11812: _execute() done 27712 1727096486.11815: dumping result to json 27712 1727096486.11818: done dumping result, returning 27712 1727096486.11825: done running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest1 [0afff68d-5257-cbc7-8716-000000000302] 27712 1727096486.11827: sending task result for task 0afff68d-5257-cbc7-8716-000000000302 27712 1727096486.11907: done sending task result for task 0afff68d-5257-cbc7-8716-000000000302 27712 1727096486.11910: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096486.11980: no more pending results, returning what we have 27712 1727096486.11983: results queue empty 27712 1727096486.11984: checking for any_errors_fatal 27712 1727096486.11991: done checking for any_errors_fatal 27712 1727096486.11992: checking for max_fail_percentage 27712 1727096486.11993: done checking for max_fail_percentage 27712 1727096486.11994: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.11994: done checking to see if all hosts have failed 27712 1727096486.11995: getting the remaining hosts for this loop 27712 1727096486.11996: done getting the remaining hosts for this loop 27712 1727096486.11999: getting the next task for host managed_node2 27712 1727096486.12003: done getting next task for host managed_node2 27712 1727096486.12005: ^ task is: TASK: Create dummy interface {{ interface }} 27712 1727096486.12008: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.12011: getting variables 27712 1727096486.12012: in VariableManager get_vars() 27712 1727096486.12041: Calling all_inventory to load vars for managed_node2 27712 1727096486.12043: Calling groups_inventory to load vars for managed_node2 27712 1727096486.12047: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.12054: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.12055: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.12057: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.12162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.12286: done with get_vars() 27712 1727096486.12293: done getting variables 27712 1727096486.12331: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096486.12407: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest1] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 09:01:26 -0400 (0:00:00.015) 0:00:11.817 ****** 27712 1727096486.12426: entering _queue_task() for managed_node2/command 27712 1727096486.12601: worker is 1 (out of 1 available) 27712 1727096486.12616: exiting _queue_task() for managed_node2/command 27712 1727096486.12631: done queuing things up, now waiting for results queue to drain 27712 1727096486.12632: waiting for pending results... 
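Throughout this block the executor resolves 'interface' and 'type' from set_fact and 'state' from include params, so manage_test_interface.yml is evidently parameterised by whatever included it. One way such an include could be driven is sketched below; the variable values are assumptions inferred from this run (the veth command ran, while every dummy and tap branch is skipped):

- name: Manage the test interface
  ansible.builtin.include_tasks: tasks/manage_test_interface.yml
  vars:
    interface: ethtest1  # matches the rendered task names in this log
    type: veth           # assumption, consistent with the dummy/tap branches all skipping
    state: present       # assumption, consistent with the delete branches all skipping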
27712 1727096486.12774: running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest1 27712 1727096486.12832: in run() - task 0afff68d-5257-cbc7-8716-000000000303 27712 1727096486.12842: variable 'ansible_search_path' from source: unknown 27712 1727096486.12846: variable 'ansible_search_path' from source: unknown 27712 1727096486.12878: calling self._execute() 27712 1727096486.12940: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.12943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.12952: variable 'omit' from source: magic vars 27712 1727096486.13196: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.13205: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.13332: variable 'type' from source: set_fact 27712 1727096486.13336: variable 'state' from source: include params 27712 1727096486.13340: variable 'interface' from source: set_fact 27712 1727096486.13343: variable 'current_interfaces' from source: set_fact 27712 1727096486.13351: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 27712 1727096486.13354: when evaluation is False, skipping this task 27712 1727096486.13357: _execute() done 27712 1727096486.13359: dumping result to json 27712 1727096486.13362: done dumping result, returning 27712 1727096486.13369: done running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest1 [0afff68d-5257-cbc7-8716-000000000303] 27712 1727096486.13376: sending task result for task 0afff68d-5257-cbc7-8716-000000000303 27712 1727096486.13454: done sending task result for task 0afff68d-5257-cbc7-8716-000000000303 27712 1727096486.13457: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096486.13504: no more pending results, returning what we have 27712 1727096486.13507: results queue empty 27712 1727096486.13508: checking for any_errors_fatal 27712 1727096486.13512: done checking for any_errors_fatal 27712 1727096486.13513: checking for max_fail_percentage 27712 1727096486.13514: done checking for max_fail_percentage 27712 1727096486.13515: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.13516: done checking to see if all hosts have failed 27712 1727096486.13516: getting the remaining hosts for this loop 27712 1727096486.13517: done getting the remaining hosts for this loop 27712 1727096486.13520: getting the next task for host managed_node2 27712 1727096486.13524: done getting next task for host managed_node2 27712 1727096486.13527: ^ task is: TASK: Delete dummy interface {{ interface }} 27712 1727096486.13529: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.13532: getting variables 27712 1727096486.13533: in VariableManager get_vars() 27712 1727096486.13563: Calling all_inventory to load vars for managed_node2 27712 1727096486.13565: Calling groups_inventory to load vars for managed_node2 27712 1727096486.13570: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.13578: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.13580: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.13583: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.13723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.13836: done with get_vars() 27712 1727096486.13842: done getting variables 27712 1727096486.13882: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096486.13951: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest1] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 09:01:26 -0400 (0:00:00.015) 0:00:11.833 ****** 27712 1727096486.13973: entering _queue_task() for managed_node2/command 27712 1727096486.14146: worker is 1 (out of 1 available) 27712 1727096486.14158: exiting _queue_task() for managed_node2/command 27712 1727096486.14173: done queuing things up, now waiting for results queue to drain 27712 1727096486.14174: waiting for pending results... 
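The dummy branch repeats the pattern established above: a create task gated on type == 'dummy' and state == 'present', and a delete task gated on the 'absent' counterpart; both are skipped in this run. A short sketch, with the when: expressions taken from the false_condition fields and the ip commands assumed:

- name: Create dummy interface {{ interface }}
  ansible.builtin.command: ip link add {{ interface }} type dummy  # assumed command
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete dummy interface {{ interface }}
  ansible.builtin.command: ip link del {{ interface }}  # assumed command
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces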
27712 1727096486.14310: running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest1 27712 1727096486.14372: in run() - task 0afff68d-5257-cbc7-8716-000000000304 27712 1727096486.14387: variable 'ansible_search_path' from source: unknown 27712 1727096486.14391: variable 'ansible_search_path' from source: unknown 27712 1727096486.14418: calling self._execute() 27712 1727096486.14478: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.14482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.14491: variable 'omit' from source: magic vars 27712 1727096486.14732: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.14738: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.14862: variable 'type' from source: set_fact 27712 1727096486.14866: variable 'state' from source: include params 27712 1727096486.14871: variable 'interface' from source: set_fact 27712 1727096486.14878: variable 'current_interfaces' from source: set_fact 27712 1727096486.14886: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 27712 1727096486.14889: when evaluation is False, skipping this task 27712 1727096486.14891: _execute() done 27712 1727096486.14893: dumping result to json 27712 1727096486.14896: done dumping result, returning 27712 1727096486.14902: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest1 [0afff68d-5257-cbc7-8716-000000000304] 27712 1727096486.14907: sending task result for task 0afff68d-5257-cbc7-8716-000000000304 27712 1727096486.14983: done sending task result for task 0afff68d-5257-cbc7-8716-000000000304 27712 1727096486.14985: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096486.15029: no more pending results, returning what we have 27712 1727096486.15033: results queue empty 27712 1727096486.15034: checking for any_errors_fatal 27712 1727096486.15039: done checking for any_errors_fatal 27712 1727096486.15040: checking for max_fail_percentage 27712 1727096486.15041: done checking for max_fail_percentage 27712 1727096486.15042: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.15042: done checking to see if all hosts have failed 27712 1727096486.15043: getting the remaining hosts for this loop 27712 1727096486.15044: done getting the remaining hosts for this loop 27712 1727096486.15048: getting the next task for host managed_node2 27712 1727096486.15052: done getting next task for host managed_node2 27712 1727096486.15054: ^ task is: TASK: Create tap interface {{ interface }} 27712 1727096486.15057: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.15060: getting variables 27712 1727096486.15061: in VariableManager get_vars() 27712 1727096486.15093: Calling all_inventory to load vars for managed_node2 27712 1727096486.15095: Calling groups_inventory to load vars for managed_node2 27712 1727096486.15097: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.15105: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.15108: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.15110: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.15219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.15336: done with get_vars() 27712 1727096486.15343: done getting variables 27712 1727096486.15383: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096486.15458: variable 'interface' from source: set_fact TASK [Create tap interface ethtest1] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 09:01:26 -0400 (0:00:00.015) 0:00:11.848 ****** 27712 1727096486.15479: entering _queue_task() for managed_node2/command 27712 1727096486.15646: worker is 1 (out of 1 available) 27712 1727096486.15659: exiting _queue_task() for managed_node2/command 27712 1727096486.15671: done queuing things up, now waiting for results queue to drain 27712 1727096486.15673: waiting for pending results... 
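The tap pair is the last per-type branch and differs only in the command used; the condition evaluated below again comes back False. A sketch under the same assumptions:

- name: Create tap interface {{ interface }}
  ansible.builtin.command: ip tuntap add dev {{ interface }} mode tap  # assumed command
  when: type == 'tap' and state == 'present' and interface not in current_interfaces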
27712 1727096486.15813: running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest1 27712 1727096486.15869: in run() - task 0afff68d-5257-cbc7-8716-000000000305 27712 1727096486.15883: variable 'ansible_search_path' from source: unknown 27712 1727096486.15887: variable 'ansible_search_path' from source: unknown 27712 1727096486.15915: calling self._execute() 27712 1727096486.15977: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.15982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.15991: variable 'omit' from source: magic vars 27712 1727096486.16227: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.16231: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.16360: variable 'type' from source: set_fact 27712 1727096486.16364: variable 'state' from source: include params 27712 1727096486.16366: variable 'interface' from source: set_fact 27712 1727096486.16375: variable 'current_interfaces' from source: set_fact 27712 1727096486.16383: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 27712 1727096486.16385: when evaluation is False, skipping this task 27712 1727096486.16389: _execute() done 27712 1727096486.16391: dumping result to json 27712 1727096486.16393: done dumping result, returning 27712 1727096486.16399: done running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest1 [0afff68d-5257-cbc7-8716-000000000305] 27712 1727096486.16404: sending task result for task 0afff68d-5257-cbc7-8716-000000000305 27712 1727096486.16482: done sending task result for task 0afff68d-5257-cbc7-8716-000000000305 27712 1727096486.16485: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096486.16531: no more pending results, returning what we have 27712 1727096486.16534: results queue empty 27712 1727096486.16535: checking for any_errors_fatal 27712 1727096486.16539: done checking for any_errors_fatal 27712 1727096486.16539: checking for max_fail_percentage 27712 1727096486.16541: done checking for max_fail_percentage 27712 1727096486.16541: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.16542: done checking to see if all hosts have failed 27712 1727096486.16543: getting the remaining hosts for this loop 27712 1727096486.16544: done getting the remaining hosts for this loop 27712 1727096486.16548: getting the next task for host managed_node2 27712 1727096486.16553: done getting next task for host managed_node2 27712 1727096486.16555: ^ task is: TASK: Delete tap interface {{ interface }} 27712 1727096486.16558: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.16561: getting variables 27712 1727096486.16562: in VariableManager get_vars() 27712 1727096486.16601: Calling all_inventory to load vars for managed_node2 27712 1727096486.16603: Calling groups_inventory to load vars for managed_node2 27712 1727096486.16605: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.16611: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.16613: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.16615: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.16757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.16872: done with get_vars() 27712 1727096486.16879: done getting variables 27712 1727096486.16915: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096486.16986: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest1] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 09:01:26 -0400 (0:00:00.015) 0:00:11.863 ****** 27712 1727096486.17005: entering _queue_task() for managed_node2/command 27712 1727096486.17178: worker is 1 (out of 1 available) 27712 1727096486.17195: exiting _queue_task() for managed_node2/command 27712 1727096486.17206: done queuing things up, now waiting for results queue to drain 27712 1727096486.17207: waiting for pending results... 
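Once the tap delete check that follows is also skipped, the play moves on to "Assert device is present", which includes assert_device_present.yml and, from there, get_interface_stat.yml before launching the stat task "Get stat for interface ethtest1". A plausible reconstruction of that pair of files is sketched here; the file and task names come from this log, while the stat path, the register name interface_stat, and the assert body are assumptions:

# assert_device_present.yml (sketch)
- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: get_interface_stat.yml

- name: Assert that the device is present
  ansible.builtin.assert:
    that: interface_stat.stat.exists  # interface_stat is an assumed register name

# get_interface_stat.yml (sketch)
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}  # assumed path for checking a network device
  register: interface_stat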
27712 1727096486.17351: running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest1 27712 1727096486.17415: in run() - task 0afff68d-5257-cbc7-8716-000000000306 27712 1727096486.17426: variable 'ansible_search_path' from source: unknown 27712 1727096486.17431: variable 'ansible_search_path' from source: unknown 27712 1727096486.17459: calling self._execute() 27712 1727096486.17522: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.17525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.17534: variable 'omit' from source: magic vars 27712 1727096486.17774: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.17786: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.17911: variable 'type' from source: set_fact 27712 1727096486.17914: variable 'state' from source: include params 27712 1727096486.17917: variable 'interface' from source: set_fact 27712 1727096486.17920: variable 'current_interfaces' from source: set_fact 27712 1727096486.17928: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 27712 1727096486.17931: when evaluation is False, skipping this task 27712 1727096486.17933: _execute() done 27712 1727096486.17935: dumping result to json 27712 1727096486.17940: done dumping result, returning 27712 1727096486.17945: done running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest1 [0afff68d-5257-cbc7-8716-000000000306] 27712 1727096486.17950: sending task result for task 0afff68d-5257-cbc7-8716-000000000306 27712 1727096486.18024: done sending task result for task 0afff68d-5257-cbc7-8716-000000000306 27712 1727096486.18027: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 27712 1727096486.18077: no more pending results, returning what we have 27712 1727096486.18080: results queue empty 27712 1727096486.18081: checking for any_errors_fatal 27712 1727096486.18086: done checking for any_errors_fatal 27712 1727096486.18087: checking for max_fail_percentage 27712 1727096486.18088: done checking for max_fail_percentage 27712 1727096486.18089: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.18090: done checking to see if all hosts have failed 27712 1727096486.18090: getting the remaining hosts for this loop 27712 1727096486.18091: done getting the remaining hosts for this loop 27712 1727096486.18094: getting the next task for host managed_node2 27712 1727096486.18099: done getting next task for host managed_node2 27712 1727096486.18102: ^ task is: TASK: Assert device is present 27712 1727096486.18104: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.18108: getting variables 27712 1727096486.18109: in VariableManager get_vars() 27712 1727096486.18142: Calling all_inventory to load vars for managed_node2 27712 1727096486.18145: Calling groups_inventory to load vars for managed_node2 27712 1727096486.18147: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.18155: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.18157: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.18160: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.18268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.18381: done with get_vars() 27712 1727096486.18387: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:32 Monday 23 September 2024 09:01:26 -0400 (0:00:00.014) 0:00:11.877 ****** 27712 1727096486.18442: entering _queue_task() for managed_node2/include_tasks 27712 1727096486.18610: worker is 1 (out of 1 available) 27712 1727096486.18623: exiting _queue_task() for managed_node2/include_tasks 27712 1727096486.18635: done queuing things up, now waiting for results queue to drain 27712 1727096486.18636: waiting for pending results... 27712 1727096486.18780: running TaskExecutor() for managed_node2/TASK: Assert device is present 27712 1727096486.18828: in run() - task 0afff68d-5257-cbc7-8716-000000000012 27712 1727096486.18840: variable 'ansible_search_path' from source: unknown 27712 1727096486.18870: calling self._execute() 27712 1727096486.18932: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.18936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.18945: variable 'omit' from source: magic vars 27712 1727096486.19181: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.19194: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.19197: _execute() done 27712 1727096486.19200: dumping result to json 27712 1727096486.19202: done dumping result, returning 27712 1727096486.19208: done running TaskExecutor() for managed_node2/TASK: Assert device is present [0afff68d-5257-cbc7-8716-000000000012] 27712 1727096486.19212: sending task result for task 0afff68d-5257-cbc7-8716-000000000012 27712 1727096486.19293: done sending task result for task 0afff68d-5257-cbc7-8716-000000000012 27712 1727096486.19297: WORKER PROCESS EXITING 27712 1727096486.19324: no more pending results, returning what we have 27712 1727096486.19327: in VariableManager get_vars() 27712 1727096486.19363: Calling all_inventory to load vars for managed_node2 27712 1727096486.19366: Calling groups_inventory to load vars for managed_node2 27712 1727096486.19370: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.19379: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.19381: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.19384: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.19537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.19645: done with get_vars() 27712 1727096486.19650: variable 
'ansible_search_path' from source: unknown 27712 1727096486.19659: we have included files to process 27712 1727096486.19660: generating all_blocks data 27712 1727096486.19660: done generating all_blocks data 27712 1727096486.19664: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27712 1727096486.19664: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27712 1727096486.19666: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 27712 1727096486.19729: in VariableManager get_vars() 27712 1727096486.19743: done with get_vars() 27712 1727096486.19813: done processing included file 27712 1727096486.19815: iterating over new_blocks loaded from include file 27712 1727096486.19816: in VariableManager get_vars() 27712 1727096486.19826: done with get_vars() 27712 1727096486.19827: filtering new block on tags 27712 1727096486.19837: done filtering new block on tags 27712 1727096486.19839: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 27712 1727096486.19842: extending task lists for all hosts with included blocks 27712 1727096486.20468: done extending task lists 27712 1727096486.20469: done processing included files 27712 1727096486.20470: results queue empty 27712 1727096486.20471: checking for any_errors_fatal 27712 1727096486.20473: done checking for any_errors_fatal 27712 1727096486.20473: checking for max_fail_percentage 27712 1727096486.20474: done checking for max_fail_percentage 27712 1727096486.20474: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.20475: done checking to see if all hosts have failed 27712 1727096486.20475: getting the remaining hosts for this loop 27712 1727096486.20476: done getting the remaining hosts for this loop 27712 1727096486.20478: getting the next task for host managed_node2 27712 1727096486.20480: done getting next task for host managed_node2 27712 1727096486.20482: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27712 1727096486.20483: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.20485: getting variables 27712 1727096486.20485: in VariableManager get_vars() 27712 1727096486.20494: Calling all_inventory to load vars for managed_node2 27712 1727096486.20496: Calling groups_inventory to load vars for managed_node2 27712 1727096486.20497: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.20500: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.20501: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.20503: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.20601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.20708: done with get_vars() 27712 1727096486.20716: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 09:01:26 -0400 (0:00:00.023) 0:00:11.900 ****** 27712 1727096486.20762: entering _queue_task() for managed_node2/include_tasks 27712 1727096486.20938: worker is 1 (out of 1 available) 27712 1727096486.20951: exiting _queue_task() for managed_node2/include_tasks 27712 1727096486.20964: done queuing things up, now waiting for results queue to drain 27712 1727096486.20965: waiting for pending results... 27712 1727096486.21115: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 27712 1727096486.21179: in run() - task 0afff68d-5257-cbc7-8716-0000000003eb 27712 1727096486.21190: variable 'ansible_search_path' from source: unknown 27712 1727096486.21193: variable 'ansible_search_path' from source: unknown 27712 1727096486.21222: calling self._execute() 27712 1727096486.21284: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.21287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.21296: variable 'omit' from source: magic vars 27712 1727096486.21634: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.21643: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.21648: _execute() done 27712 1727096486.21651: dumping result to json 27712 1727096486.21653: done dumping result, returning 27712 1727096486.21661: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-cbc7-8716-0000000003eb] 27712 1727096486.21664: sending task result for task 0afff68d-5257-cbc7-8716-0000000003eb 27712 1727096486.21751: done sending task result for task 0afff68d-5257-cbc7-8716-0000000003eb 27712 1727096486.21754: WORKER PROCESS EXITING 27712 1727096486.21801: no more pending results, returning what we have 27712 1727096486.21805: in VariableManager get_vars() 27712 1727096486.21842: Calling all_inventory to load vars for managed_node2 27712 1727096486.21844: Calling groups_inventory to load vars for managed_node2 27712 1727096486.21847: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.21856: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.21858: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.21861: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.21980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 27712 1727096486.22089: done with get_vars() 27712 1727096486.22094: variable 'ansible_search_path' from source: unknown 27712 1727096486.22095: variable 'ansible_search_path' from source: unknown 27712 1727096486.22120: we have included files to process 27712 1727096486.22121: generating all_blocks data 27712 1727096486.22122: done generating all_blocks data 27712 1727096486.22122: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096486.22123: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096486.22124: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096486.22235: done processing included file 27712 1727096486.22236: iterating over new_blocks loaded from include file 27712 1727096486.22238: in VariableManager get_vars() 27712 1727096486.22250: done with get_vars() 27712 1727096486.22251: filtering new block on tags 27712 1727096486.22260: done filtering new block on tags 27712 1727096486.22262: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 27712 1727096486.22287: extending task lists for all hosts with included blocks 27712 1727096486.22345: done extending task lists 27712 1727096486.22346: done processing included files 27712 1727096486.22346: results queue empty 27712 1727096486.22347: checking for any_errors_fatal 27712 1727096486.22348: done checking for any_errors_fatal 27712 1727096486.22349: checking for max_fail_percentage 27712 1727096486.22350: done checking for max_fail_percentage 27712 1727096486.22350: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.22351: done checking to see if all hosts have failed 27712 1727096486.22351: getting the remaining hosts for this loop 27712 1727096486.22352: done getting the remaining hosts for this loop 27712 1727096486.22353: getting the next task for host managed_node2 27712 1727096486.22356: done getting next task for host managed_node2 27712 1727096486.22357: ^ task is: TASK: Get stat for interface {{ interface }} 27712 1727096486.22359: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.22361: getting variables 27712 1727096486.22361: in VariableManager get_vars() 27712 1727096486.22372: Calling all_inventory to load vars for managed_node2 27712 1727096486.22373: Calling groups_inventory to load vars for managed_node2 27712 1727096486.22375: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.22378: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.22379: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.22381: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.22457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.22563: done with get_vars() 27712 1727096486.22573: done getting variables 27712 1727096486.22672: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:01:26 -0400 (0:00:00.019) 0:00:11.920 ****** 27712 1727096486.22691: entering _queue_task() for managed_node2/stat 27712 1727096486.22861: worker is 1 (out of 1 available) 27712 1727096486.22877: exiting _queue_task() for managed_node2/stat 27712 1727096486.22889: done queuing things up, now waiting for results queue to drain 27712 1727096486.22890: waiting for pending results... 27712 1727096486.23106: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest1 27712 1727096486.23176: in run() - task 0afff68d-5257-cbc7-8716-000000000483 27712 1727096486.23180: variable 'ansible_search_path' from source: unknown 27712 1727096486.23184: variable 'ansible_search_path' from source: unknown 27712 1727096486.23190: calling self._execute() 27712 1727096486.23278: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.23289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.23303: variable 'omit' from source: magic vars 27712 1727096486.23620: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.23636: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.23646: variable 'omit' from source: magic vars 27712 1727096486.23775: variable 'omit' from source: magic vars 27712 1727096486.23792: variable 'interface' from source: set_fact 27712 1727096486.23814: variable 'omit' from source: magic vars 27712 1727096486.23856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096486.23899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096486.23921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096486.23942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096486.23958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096486.23994: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096486.24003: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.24010: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 27712 1727096486.24109: Set connection var ansible_connection to ssh 27712 1727096486.24124: Set connection var ansible_pipelining to False 27712 1727096486.24135: Set connection var ansible_timeout to 10 27712 1727096486.24142: Set connection var ansible_shell_type to sh 27712 1727096486.24155: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096486.24165: Set connection var ansible_shell_executable to /bin/sh 27712 1727096486.24374: variable 'ansible_shell_executable' from source: unknown 27712 1727096486.24378: variable 'ansible_connection' from source: unknown 27712 1727096486.24381: variable 'ansible_module_compression' from source: unknown 27712 1727096486.24383: variable 'ansible_shell_type' from source: unknown 27712 1727096486.24385: variable 'ansible_shell_executable' from source: unknown 27712 1727096486.24387: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.24388: variable 'ansible_pipelining' from source: unknown 27712 1727096486.24390: variable 'ansible_timeout' from source: unknown 27712 1727096486.24392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.24425: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096486.24441: variable 'omit' from source: magic vars 27712 1727096486.24451: starting attempt loop 27712 1727096486.24457: running the handler 27712 1727096486.24480: _low_level_execute_command(): starting 27712 1727096486.24492: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096486.25110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096486.25130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096486.25150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.25184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.25198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.25245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.26928: stdout chunk (state=3): >>>/root <<< 27712 1727096486.27176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.27179: stdout chunk (state=3): >>><<< 27712 1727096486.27182: stderr chunk (state=3): >>><<< 27712 1727096486.27185: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096486.27188: _low_level_execute_command(): starting 27712 1727096486.27190: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744 `" && echo ansible-tmp-1727096486.2709584-28401-67790806590744="` echo /root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744 `" ) && sleep 0' 27712 1727096486.27801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096486.27815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.27827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096486.27844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096486.27876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096486.27995: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.28027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.28041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096486.28062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.28135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.30093: stdout chunk (state=3): >>>ansible-tmp-1727096486.2709584-28401-67790806590744=/root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744 <<< 27712 1727096486.30234: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.30252: stderr chunk (state=3): >>><<< 27712 1727096486.30261: stdout chunk (state=3): >>><<< 27712 1727096486.30476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096486.2709584-28401-67790806590744=/root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096486.30479: variable 'ansible_module_compression' from source: unknown 27712 1727096486.30482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27712 1727096486.30484: variable 'ansible_facts' from source: unknown 27712 1727096486.30549: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/AnsiballZ_stat.py 27712 1727096486.30726: Sending initial data 27712 1727096486.30735: Sent initial data (152 bytes) 27712 1727096486.31308: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096486.31321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.31610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.31624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.33228: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096486.33286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096486.33315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/AnsiballZ_stat.py" <<< 27712 1727096486.33332: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp43wn0srg /root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/AnsiballZ_stat.py <<< 27712 1727096486.33366: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp43wn0srg" to remote "/root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/AnsiballZ_stat.py" <<< 27712 1727096486.34181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.34184: stdout chunk (state=3): >>><<< 27712 1727096486.34187: stderr chunk (state=3): >>><<< 27712 1727096486.34195: done transferring module to remote 27712 1727096486.34218: _low_level_execute_command(): starting 27712 1727096486.34234: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/ /root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/AnsiballZ_stat.py && sleep 0' 27712 1727096486.34988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.35017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.35044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096486.35061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.35128: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.36944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.36951: stdout chunk (state=3): >>><<< 27712 1727096486.36961: stderr chunk (state=3): >>><<< 27712 1727096486.37005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096486.37009: _low_level_execute_command(): starting 27712 1727096486.37011: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/AnsiballZ_stat.py && sleep 0' 27712 1727096486.37413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.37416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096486.37419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.37422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096486.37424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.37475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.37480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.37517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.52773: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, 
"isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30816, "dev": 23, "nlink": 1, "atime": 1727096485.0607395, "mtime": 1727096485.0607395, "ctime": 1727096485.0607395, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27712 1727096486.54121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096486.54148: stderr chunk (state=3): >>><<< 27712 1727096486.54151: stdout chunk (state=3): >>><<< 27712 1727096486.54170: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30816, "dev": 23, "nlink": 1, "atime": 1727096485.0607395, "mtime": 1727096485.0607395, "ctime": 1727096485.0607395, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096486.54211: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096486.54219: _low_level_execute_command(): starting 27712 1727096486.54224: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096486.2709584-28401-67790806590744/ > /dev/null 2>&1 && sleep 0' 27712 1727096486.54707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096486.54710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.54713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.54715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096486.54717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.54774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.54779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096486.54782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.54810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.56611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.56636: stderr chunk (state=3): >>><<< 27712 1727096486.56639: stdout chunk (state=3): >>><<< 27712 1727096486.56652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096486.56658: handler run complete 27712 1727096486.56695: attempt loop complete, returning result 27712 1727096486.56698: _execute() done 27712 1727096486.56701: dumping result to json 27712 1727096486.56706: done dumping result, returning 27712 1727096486.56714: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest1 [0afff68d-5257-cbc7-8716-000000000483] 27712 1727096486.56719: sending task result for task 0afff68d-5257-cbc7-8716-000000000483 27712 1727096486.56825: done sending task result for task 0afff68d-5257-cbc7-8716-000000000483 27712 1727096486.56828: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1727096485.0607395, "block_size": 4096, "blocks": 0, "ctime": 1727096485.0607395, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30816, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "mode": "0777", "mtime": 1727096485.0607395, "nlink": 1, "path": "/sys/class/net/ethtest1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 27712 1727096486.56914: no more pending results, returning what we have 27712 1727096486.56917: results queue empty 27712 1727096486.56918: checking for any_errors_fatal 27712 1727096486.56919: done checking for any_errors_fatal 27712 1727096486.56920: checking for max_fail_percentage 27712 1727096486.56921: done checking for max_fail_percentage 27712 1727096486.56922: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.56923: done checking to see if all hosts have failed 27712 1727096486.56923: getting the remaining hosts for this loop 27712 1727096486.56925: done getting the remaining hosts for this loop 27712 1727096486.56928: getting the next task for host managed_node2 27712 1727096486.56935: done getting next task for host managed_node2 27712 1727096486.56940: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 27712 1727096486.56943: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.56946: getting variables 27712 1727096486.56947: in VariableManager get_vars() 27712 1727096486.56994: Calling all_inventory to load vars for managed_node2 27712 1727096486.56997: Calling groups_inventory to load vars for managed_node2 27712 1727096486.56999: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.57009: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.57011: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.57013: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.57162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.57283: done with get_vars() 27712 1727096486.57291: done getting variables 27712 1727096486.57333: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096486.57423: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest1'] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 09:01:26 -0400 (0:00:00.347) 0:00:12.267 ****** 27712 1727096486.57444: entering _queue_task() for managed_node2/assert 27712 1727096486.57642: worker is 1 (out of 1 available) 27712 1727096486.57656: exiting _queue_task() for managed_node2/assert 27712 1727096486.57673: done queuing things up, now waiting for results queue to drain 27712 1727096486.57675: waiting for pending results... 
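The task queued here lives at assert_device_present.yml:5 and, as the executor output just below confirms, it evaluates interface_stat.stat.exists. A minimal sketch of that assertion follows; the conditional and task name come from the log, while the fail_msg wording is an illustrative assumption.

    # Hedged sketch of tasks/assert_device_present.yml:5. The conditional
    # interface_stat.stat.exists is the one the executor evaluates below;
    # the fail_msg text is illustrative only.
    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists
        fail_msg: "Interface {{ interface }} is not present"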
27712 1727096486.57827: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest1' 27712 1727096486.57892: in run() - task 0afff68d-5257-cbc7-8716-0000000003ec 27712 1727096486.57902: variable 'ansible_search_path' from source: unknown 27712 1727096486.57910: variable 'ansible_search_path' from source: unknown 27712 1727096486.57942: calling self._execute() 27712 1727096486.58007: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.58015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.58023: variable 'omit' from source: magic vars 27712 1727096486.58287: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.58296: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.58302: variable 'omit' from source: magic vars 27712 1727096486.58327: variable 'omit' from source: magic vars 27712 1727096486.58399: variable 'interface' from source: set_fact 27712 1727096486.58412: variable 'omit' from source: magic vars 27712 1727096486.58442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096486.58472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096486.58491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096486.58503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096486.58513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096486.58540: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096486.58543: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.58545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.58616: Set connection var ansible_connection to ssh 27712 1727096486.58623: Set connection var ansible_pipelining to False 27712 1727096486.58628: Set connection var ansible_timeout to 10 27712 1727096486.58631: Set connection var ansible_shell_type to sh 27712 1727096486.58637: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096486.58642: Set connection var ansible_shell_executable to /bin/sh 27712 1727096486.58660: variable 'ansible_shell_executable' from source: unknown 27712 1727096486.58665: variable 'ansible_connection' from source: unknown 27712 1727096486.58675: variable 'ansible_module_compression' from source: unknown 27712 1727096486.58680: variable 'ansible_shell_type' from source: unknown 27712 1727096486.58683: variable 'ansible_shell_executable' from source: unknown 27712 1727096486.58686: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.58688: variable 'ansible_pipelining' from source: unknown 27712 1727096486.58690: variable 'ansible_timeout' from source: unknown 27712 1727096486.58692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.58788: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096486.58792: variable 'omit' from source: magic vars 27712 1727096486.58797: starting attempt loop 27712 1727096486.58799: running the handler 27712 1727096486.58886: variable 'interface_stat' from source: set_fact 27712 1727096486.58902: Evaluated conditional (interface_stat.stat.exists): True 27712 1727096486.58905: handler run complete 27712 1727096486.58917: attempt loop complete, returning result 27712 1727096486.58920: _execute() done 27712 1727096486.58922: dumping result to json 27712 1727096486.58924: done dumping result, returning 27712 1727096486.58931: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest1' [0afff68d-5257-cbc7-8716-0000000003ec] 27712 1727096486.58934: sending task result for task 0afff68d-5257-cbc7-8716-0000000003ec 27712 1727096486.59014: done sending task result for task 0afff68d-5257-cbc7-8716-0000000003ec 27712 1727096486.59017: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096486.59066: no more pending results, returning what we have 27712 1727096486.59072: results queue empty 27712 1727096486.59073: checking for any_errors_fatal 27712 1727096486.59082: done checking for any_errors_fatal 27712 1727096486.59082: checking for max_fail_percentage 27712 1727096486.59083: done checking for max_fail_percentage 27712 1727096486.59084: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.59085: done checking to see if all hosts have failed 27712 1727096486.59086: getting the remaining hosts for this loop 27712 1727096486.59087: done getting the remaining hosts for this loop 27712 1727096486.59090: getting the next task for host managed_node2 27712 1727096486.59097: done getting next task for host managed_node2 27712 1727096486.59101: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27712 1727096486.59104: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096486.59117: getting variables 27712 1727096486.59118: in VariableManager get_vars() 27712 1727096486.59157: Calling all_inventory to load vars for managed_node2 27712 1727096486.59160: Calling groups_inventory to load vars for managed_node2 27712 1727096486.59162: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.59172: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.59174: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.59177: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.59304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.59509: done with get_vars() 27712 1727096486.59521: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:01:26 -0400 (0:00:00.021) 0:00:12.289 ****** 27712 1727096486.59615: entering _queue_task() for managed_node2/include_tasks 27712 1727096486.59906: worker is 1 (out of 1 available) 27712 1727096486.59918: exiting _queue_task() for managed_node2/include_tasks 27712 1727096486.59985: done queuing things up, now waiting for results queue to drain 27712 1727096486.59987: waiting for pending results... 27712 1727096486.60276: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27712 1727096486.60324: in run() - task 0afff68d-5257-cbc7-8716-00000000001b 27712 1727096486.60348: variable 'ansible_search_path' from source: unknown 27712 1727096486.60357: variable 'ansible_search_path' from source: unknown 27712 1727096486.60407: calling self._execute() 27712 1727096486.60504: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.60523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.60674: variable 'omit' from source: magic vars 27712 1727096486.61320: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.61347: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.61359: _execute() done 27712 1727096486.61370: dumping result to json 27712 1727096486.61381: done dumping result, returning 27712 1727096486.61393: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-cbc7-8716-00000000001b] 27712 1727096486.61404: sending task result for task 0afff68d-5257-cbc7-8716-00000000001b 27712 1727096486.61597: done sending task result for task 0afff68d-5257-cbc7-8716-00000000001b 27712 1727096486.61601: WORKER PROCESS EXITING 27712 1727096486.61645: no more pending results, returning what we have 27712 1727096486.61650: in VariableManager get_vars() 27712 1727096486.61702: Calling all_inventory to load vars for managed_node2 27712 1727096486.61705: Calling groups_inventory to load vars for managed_node2 27712 1727096486.61708: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.61880: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.61883: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.61887: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.62356: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.62555: done with get_vars() 27712 1727096486.62562: variable 'ansible_search_path' from source: unknown 27712 1727096486.62563: variable 'ansible_search_path' from source: unknown 27712 1727096486.62604: we have included files to process 27712 1727096486.62606: generating all_blocks data 27712 1727096486.62607: done generating all_blocks data 27712 1727096486.62610: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096486.62611: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096486.62613: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096486.63344: done processing included file 27712 1727096486.63347: iterating over new_blocks loaded from include file 27712 1727096486.63348: in VariableManager get_vars() 27712 1727096486.63375: done with get_vars() 27712 1727096486.63377: filtering new block on tags 27712 1727096486.63400: done filtering new block on tags 27712 1727096486.63403: in VariableManager get_vars() 27712 1727096486.63426: done with get_vars() 27712 1727096486.63428: filtering new block on tags 27712 1727096486.63448: done filtering new block on tags 27712 1727096486.63451: in VariableManager get_vars() 27712 1727096486.63478: done with get_vars() 27712 1727096486.63480: filtering new block on tags 27712 1727096486.63504: done filtering new block on tags 27712 1727096486.63506: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 27712 1727096486.63512: extending task lists for all hosts with included blocks 27712 1727096486.64118: done extending task lists 27712 1727096486.64119: done processing included files 27712 1727096486.64120: results queue empty 27712 1727096486.64120: checking for any_errors_fatal 27712 1727096486.64122: done checking for any_errors_fatal 27712 1727096486.64123: checking for max_fail_percentage 27712 1727096486.64123: done checking for max_fail_percentage 27712 1727096486.64124: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.64124: done checking to see if all hosts have failed 27712 1727096486.64124: getting the remaining hosts for this loop 27712 1727096486.64125: done getting the remaining hosts for this loop 27712 1727096486.64127: getting the next task for host managed_node2 27712 1727096486.64129: done getting next task for host managed_node2 27712 1727096486.64131: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27712 1727096486.64133: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096486.64138: getting variables 27712 1727096486.64139: in VariableManager get_vars() 27712 1727096486.64149: Calling all_inventory to load vars for managed_node2 27712 1727096486.64151: Calling groups_inventory to load vars for managed_node2 27712 1727096486.64152: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.64156: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.64157: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.64159: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.64237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.64348: done with get_vars() 27712 1727096486.64354: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:01:26 -0400 (0:00:00.047) 0:00:12.337 ****** 27712 1727096486.64403: entering _queue_task() for managed_node2/setup 27712 1727096486.64585: worker is 1 (out of 1 available) 27712 1727096486.64597: exiting _queue_task() for managed_node2/setup 27712 1727096486.64607: done queuing things up, now waiting for results queue to drain 27712 1727096486.64609: waiting for pending results... 27712 1727096486.64763: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27712 1727096486.64856: in run() - task 0afff68d-5257-cbc7-8716-00000000049b 27712 1727096486.64869: variable 'ansible_search_path' from source: unknown 27712 1727096486.64873: variable 'ansible_search_path' from source: unknown 27712 1727096486.64904: calling self._execute() 27712 1727096486.64971: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.64979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.64990: variable 'omit' from source: magic vars 27712 1727096486.65246: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.65255: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.65400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096486.66927: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096486.66971: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096486.67000: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096486.67028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096486.67049: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096486.67118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
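The entries above show roles/network/tasks/main.yml:4 pulling in set_facts.yml via include_tasks, with the variable manager generating and tag-filtering three new blocks from it. The include itself reduces to something like the following sketch; only the task name, the file path, and the include_tasks action are confirmed by the log.

    # Hedged sketch of roles/network/tasks/main.yml:4 as implied by the
    # include_tasks queuing and the set_facts.yml processing above.
    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml

The three "filtering new block on tags" passes line up with the three tasks that run next: the required-facts check, the ostree stat, and the ostree set_fact.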
27712 1727096486.67140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096486.67157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096486.67188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096486.67200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096486.67238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096486.67255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096486.67273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096486.67301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096486.67312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096486.67468: variable '__network_required_facts' from source: role '' defaults 27712 1727096486.67472: variable 'ansible_facts' from source: unknown 27712 1727096486.67485: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 27712 1727096486.67488: when evaluation is False, skipping this task 27712 1727096486.67491: _execute() done 27712 1727096486.67493: dumping result to json 27712 1727096486.67495: done dumping result, returning 27712 1727096486.67502: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-cbc7-8716-00000000049b] 27712 1727096486.67505: sending task result for task 0afff68d-5257-cbc7-8716-00000000049b 27712 1727096486.67586: done sending task result for task 0afff68d-5257-cbc7-8716-00000000049b 27712 1727096486.67589: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096486.67630: no more pending results, returning what we have 27712 1727096486.67633: results queue empty 27712 1727096486.67634: checking for any_errors_fatal 27712 1727096486.67635: done checking for any_errors_fatal 27712 1727096486.67636: checking for max_fail_percentage 27712 1727096486.67637: done checking for max_fail_percentage 27712 1727096486.67638: checking to see if all hosts have failed and the running 
result is not ok 27712 1727096486.67639: done checking to see if all hosts have failed 27712 1727096486.67639: getting the remaining hosts for this loop 27712 1727096486.67640: done getting the remaining hosts for this loop 27712 1727096486.67644: getting the next task for host managed_node2 27712 1727096486.67652: done getting next task for host managed_node2 27712 1727096486.67655: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 27712 1727096486.67659: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096486.67673: getting variables 27712 1727096486.67674: in VariableManager get_vars() 27712 1727096486.67712: Calling all_inventory to load vars for managed_node2 27712 1727096486.67715: Calling groups_inventory to load vars for managed_node2 27712 1727096486.67717: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.67726: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.67728: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.67731: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.67951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.68181: done with get_vars() 27712 1727096486.68192: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:01:26 -0400 (0:00:00.038) 0:00:12.376 ****** 27712 1727096486.68285: entering _queue_task() for managed_node2/stat 27712 1727096486.68516: worker is 1 (out of 1 available) 27712 1727096486.68528: exiting _queue_task() for managed_node2/stat 27712 1727096486.68539: done queuing things up, now waiting for results queue to drain 27712 1727096486.68541: waiting for pending results... 
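The skip above comes from the condition __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluating to False, i.e. every fact the role needs is already in the cached ansible_facts. A sketch of that gating pattern is below; the when expression and task name are quoted from the log, while the setup arguments are an assumed detail, not the role's actual body.

    # Hedged sketch of the fact-gating task at set_facts.yml:3. The when
    # expression is taken from the log; gather_subset is an assumed detail.
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Since no_log is in effect for this task, the console shows only the censored skip message rather than the evaluated fact list.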
27712 1727096486.68987: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 27712 1727096486.68991: in run() - task 0afff68d-5257-cbc7-8716-00000000049d 27712 1727096486.68993: variable 'ansible_search_path' from source: unknown 27712 1727096486.68996: variable 'ansible_search_path' from source: unknown 27712 1727096486.69018: calling self._execute() 27712 1727096486.69111: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.69123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.69135: variable 'omit' from source: magic vars 27712 1727096486.69499: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.69516: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.69684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096486.69882: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096486.69912: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096486.69941: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096486.69966: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096486.70027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096486.70047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096486.70068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096486.70091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096486.70152: variable '__network_is_ostree' from source: set_fact 27712 1727096486.70160: Evaluated conditional (not __network_is_ostree is defined): False 27712 1727096486.70163: when evaluation is False, skipping this task 27712 1727096486.70166: _execute() done 27712 1727096486.70170: dumping result to json 27712 1727096486.70173: done dumping result, returning 27712 1727096486.70182: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-cbc7-8716-00000000049d] 27712 1727096486.70185: sending task result for task 0afff68d-5257-cbc7-8716-00000000049d 27712 1727096486.70263: done sending task result for task 0afff68d-5257-cbc7-8716-00000000049d 27712 1727096486.70266: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27712 1727096486.70343: no more pending results, returning what we have 27712 1727096486.70346: results queue empty 27712 1727096486.70347: checking for any_errors_fatal 27712 1727096486.70351: done checking for any_errors_fatal 27712 1727096486.70352: checking for 
max_fail_percentage 27712 1727096486.70353: done checking for max_fail_percentage 27712 1727096486.70354: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.70355: done checking to see if all hosts have failed 27712 1727096486.70356: getting the remaining hosts for this loop 27712 1727096486.70357: done getting the remaining hosts for this loop 27712 1727096486.70360: getting the next task for host managed_node2 27712 1727096486.70365: done getting next task for host managed_node2 27712 1727096486.70370: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27712 1727096486.70373: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096486.70391: getting variables 27712 1727096486.70392: in VariableManager get_vars() 27712 1727096486.70421: Calling all_inventory to load vars for managed_node2 27712 1727096486.70423: Calling groups_inventory to load vars for managed_node2 27712 1727096486.70424: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.70430: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.70432: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.70434: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.70542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.70666: done with get_vars() 27712 1727096486.70676: done getting variables 27712 1727096486.70714: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:01:26 -0400 (0:00:00.024) 0:00:12.400 ****** 27712 1727096486.70738: entering _queue_task() for managed_node2/set_fact 27712 1727096486.70912: worker is 1 (out of 1 available) 27712 1727096486.70924: exiting _queue_task() for managed_node2/set_fact 27712 1727096486.70936: done queuing things up, now waiting for results queue to drain 27712 1727096486.70937: waiting for pending results... 
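
The companion task at set_facts.yml:17 is a set_fact guarded by the same conditions, so once __network_is_ostree exists neither this step nor the stat check above runs again. A minimal sketch, reusing the hypothetical __ostree_booted_stat register from the previous sketch; the exact expression is an assumption, only the set_fact action and the evaluated conditionals are confirmed by the log.

# Sketch only: turn the stat result into the persistent flag the guards test.
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"  # register name is illustrative
  when:
    - ansible_distribution_major_version != '6'
    - not __network_is_ostree is defined
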
27712 1727096486.71091: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27712 1727096486.71185: in run() - task 0afff68d-5257-cbc7-8716-00000000049e 27712 1727096486.71196: variable 'ansible_search_path' from source: unknown 27712 1727096486.71200: variable 'ansible_search_path' from source: unknown 27712 1727096486.71225: calling self._execute() 27712 1727096486.71287: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.71291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.71298: variable 'omit' from source: magic vars 27712 1727096486.71675: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.71679: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.71785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096486.72110: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096486.72157: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096486.72198: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096486.72231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096486.72313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096486.72343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096486.72376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096486.72406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096486.72490: variable '__network_is_ostree' from source: set_fact 27712 1727096486.72676: Evaluated conditional (not __network_is_ostree is defined): False 27712 1727096486.72679: when evaluation is False, skipping this task 27712 1727096486.72681: _execute() done 27712 1727096486.72684: dumping result to json 27712 1727096486.72686: done dumping result, returning 27712 1727096486.72688: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-cbc7-8716-00000000049e] 27712 1727096486.72691: sending task result for task 0afff68d-5257-cbc7-8716-00000000049e 27712 1727096486.72747: done sending task result for task 0afff68d-5257-cbc7-8716-00000000049e 27712 1727096486.72749: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27712 1727096486.72803: no more pending results, returning what we have 27712 1727096486.72806: results queue empty 27712 1727096486.72807: checking for any_errors_fatal 27712 1727096486.72811: done checking for any_errors_fatal 27712 
1727096486.72812: checking for max_fail_percentage 27712 1727096486.72813: done checking for max_fail_percentage 27712 1727096486.72814: checking to see if all hosts have failed and the running result is not ok 27712 1727096486.72815: done checking to see if all hosts have failed 27712 1727096486.72815: getting the remaining hosts for this loop 27712 1727096486.72816: done getting the remaining hosts for this loop 27712 1727096486.72819: getting the next task for host managed_node2 27712 1727096486.72826: done getting next task for host managed_node2 27712 1727096486.72829: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27712 1727096486.72832: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096486.72859: getting variables 27712 1727096486.72861: in VariableManager get_vars() 27712 1727096486.72901: Calling all_inventory to load vars for managed_node2 27712 1727096486.72904: Calling groups_inventory to load vars for managed_node2 27712 1727096486.72906: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096486.72913: Calling all_plugins_play to load vars for managed_node2 27712 1727096486.72924: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096486.72938: Calling groups_plugins_play to load vars for managed_node2 27712 1727096486.73162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096486.73422: done with get_vars() 27712 1727096486.73431: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:01:26 -0400 (0:00:00.027) 0:00:12.428 ****** 27712 1727096486.73507: entering _queue_task() for managed_node2/service_facts 27712 1727096486.73509: Creating lock for service_facts 27712 1727096486.73893: worker is 1 (out of 1 available) 27712 1727096486.73905: exiting _queue_task() for managed_node2/service_facts 27712 1727096486.73917: done queuing things up, now waiting for results queue to drain 27712 1727096486.73919: waiting for pending results... 
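
Unlike the two skipped steps, the service_facts task at set_facts.yml:21 actually executes: the log below walks through SSH connection setup, creation of a remote temp directory, transfer of AnsiballZ_service_facts.py over SFTP, and execution with /usr/bin/python3.12, ending in the ansible_facts.services dictionary that is dumped further down. A minimal sketch of the gathering step plus an illustrative consumer; the follow-up set_fact and the _nm_running variable are assumptions for illustration, not part of the role.

# Sketch only: gather systemd unit states, then branch on one entry from the result.
- name: Check which services are running
  ansible.builtin.service_facts:

- name: Example only - record whether NetworkManager is currently running
  ansible.builtin.set_fact:
    _nm_running: "{{ ansible_facts.services['NetworkManager.service'].state | default('') == 'running' }}"
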
27712 1727096486.74182: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 27712 1727096486.74331: in run() - task 0afff68d-5257-cbc7-8716-0000000004a0 27712 1727096486.74350: variable 'ansible_search_path' from source: unknown 27712 1727096486.74357: variable 'ansible_search_path' from source: unknown 27712 1727096486.74402: calling self._execute() 27712 1727096486.74494: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.74505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.74522: variable 'omit' from source: magic vars 27712 1727096486.74951: variable 'ansible_distribution_major_version' from source: facts 27712 1727096486.74955: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096486.74958: variable 'omit' from source: magic vars 27712 1727096486.74979: variable 'omit' from source: magic vars 27712 1727096486.75017: variable 'omit' from source: magic vars 27712 1727096486.75059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096486.75106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096486.75129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096486.75149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096486.75163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096486.75201: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096486.75290: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.75294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.75315: Set connection var ansible_connection to ssh 27712 1727096486.75329: Set connection var ansible_pipelining to False 27712 1727096486.75340: Set connection var ansible_timeout to 10 27712 1727096486.75347: Set connection var ansible_shell_type to sh 27712 1727096486.75360: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096486.75375: Set connection var ansible_shell_executable to /bin/sh 27712 1727096486.75405: variable 'ansible_shell_executable' from source: unknown 27712 1727096486.75414: variable 'ansible_connection' from source: unknown 27712 1727096486.75421: variable 'ansible_module_compression' from source: unknown 27712 1727096486.75427: variable 'ansible_shell_type' from source: unknown 27712 1727096486.75433: variable 'ansible_shell_executable' from source: unknown 27712 1727096486.75443: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096486.75452: variable 'ansible_pipelining' from source: unknown 27712 1727096486.75458: variable 'ansible_timeout' from source: unknown 27712 1727096486.75466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096486.75675: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096486.75723: variable 'omit' from source: magic vars 27712 
1727096486.75726: starting attempt loop 27712 1727096486.75728: running the handler 27712 1727096486.75730: _low_level_execute_command(): starting 27712 1727096486.75734: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096486.76455: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096486.76575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.76595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096486.76643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.76681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.78367: stdout chunk (state=3): >>>/root <<< 27712 1727096486.78474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.78508: stderr chunk (state=3): >>><<< 27712 1727096486.78510: stdout chunk (state=3): >>><<< 27712 1727096486.78522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096486.78557: _low_level_execute_command(): starting 27712 1727096486.78562: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013 `" && echo ansible-tmp-1727096486.7852736-28430-241895538400013="` echo 
/root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013 `" ) && sleep 0' 27712 1727096486.78966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096486.78974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.78977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.78986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.79027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.79031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.79076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.80975: stdout chunk (state=3): >>>ansible-tmp-1727096486.7852736-28430-241895538400013=/root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013 <<< 27712 1727096486.81079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.81105: stderr chunk (state=3): >>><<< 27712 1727096486.81108: stdout chunk (state=3): >>><<< 27712 1727096486.81123: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096486.7852736-28430-241895538400013=/root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096486.81159: variable 'ansible_module_compression' from source: unknown 27712 1727096486.81200: ANSIBALLZ: Using lock for service_facts 27712 1727096486.81203: ANSIBALLZ: 
Acquiring lock 27712 1727096486.81206: ANSIBALLZ: Lock acquired: 140297907823216 27712 1727096486.81208: ANSIBALLZ: Creating module 27712 1727096486.93275: ANSIBALLZ: Writing module into payload 27712 1727096486.93280: ANSIBALLZ: Writing module 27712 1727096486.93285: ANSIBALLZ: Renaming module 27712 1727096486.93298: ANSIBALLZ: Done creating module 27712 1727096486.93321: variable 'ansible_facts' from source: unknown 27712 1727096486.93409: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/AnsiballZ_service_facts.py 27712 1727096486.93699: Sending initial data 27712 1727096486.93702: Sent initial data (162 bytes) 27712 1727096486.94231: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096486.94248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096486.94265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096486.94288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096486.94307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096486.94320: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096486.94390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096486.94425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.94440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096486.94462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.94539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096486.96199: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096486.96248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096486.96386: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpindz7816 /root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/AnsiballZ_service_facts.py <<< 27712 1727096486.96389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/AnsiballZ_service_facts.py" <<< 27712 1727096486.96492: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpindz7816" to remote "/root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/AnsiballZ_service_facts.py" <<< 27712 1727096486.97935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096486.97978: stderr chunk (state=3): >>><<< 27712 1727096486.97989: stdout chunk (state=3): >>><<< 27712 1727096486.98186: done transferring module to remote 27712 1727096486.98207: _low_level_execute_command(): starting 27712 1727096486.98366: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/ /root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/AnsiballZ_service_facts.py && sleep 0' 27712 1727096486.99395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096486.99505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096486.99781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096486.99792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096487.01600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096487.01604: stdout chunk (state=3): >>><<< 27712 1727096487.01606: stderr chunk (state=3): >>><<< 27712 1727096487.01620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096487.01629: _low_level_execute_command(): starting 27712 1727096487.01643: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/AnsiballZ_service_facts.py && sleep 0' 27712 1727096487.02708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096487.02741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096487.02762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096487.02835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096488.57482: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": 
{"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status<<< 27712 1727096488.57517: stdout chunk (state=3): >>>": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", 
"source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": 
{"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27712 1727096488.59059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096488.59090: stdout chunk (state=3): >>><<< 27712 1727096488.59116: stderr chunk (state=3): >>><<< 27712 1727096488.59139: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096488.59637: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096488.59722: _low_level_execute_command(): starting 27712 1727096488.59725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096486.7852736-28430-241895538400013/ > /dev/null 2>&1 && sleep 0' 27712 1727096488.60232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096488.60248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096488.60262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096488.60284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096488.60301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096488.60313: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096488.60327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096488.60426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096488.60449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096488.60484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096488.62301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096488.62324: stderr chunk (state=3): >>><<< 27712 1727096488.62327: stdout chunk (state=3): >>><<< 27712 1727096488.62343: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096488.62348: handler run complete 27712 1727096488.62483: variable 'ansible_facts' from source: unknown 27712 1727096488.62671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096488.63043: variable 'ansible_facts' from source: unknown 27712 1727096488.63187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096488.63509: attempt loop complete, returning result 27712 1727096488.63519: _execute() done 27712 1727096488.63526: dumping result to json 27712 1727096488.63605: done dumping result, returning 27712 1727096488.63634: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-cbc7-8716-0000000004a0] 27712 1727096488.63649: sending task result for task 0afff68d-5257-cbc7-8716-0000000004a0 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096488.66043: no more pending results, returning what we have 27712 1727096488.66048: results queue empty 27712 1727096488.66049: checking for any_errors_fatal 27712 1727096488.66054: done checking for any_errors_fatal 27712 1727096488.66055: checking for max_fail_percentage 27712 1727096488.66057: done checking for max_fail_percentage 27712 1727096488.66057: checking to see if all hosts have failed and the running result is not ok 27712 1727096488.66058: done checking to see if all hosts have failed 27712 1727096488.66059: getting the remaining hosts for this loop 27712 1727096488.66060: done getting the remaining hosts for this loop 27712 1727096488.66063: getting the next task for host managed_node2 27712 1727096488.66072: done getting next task for host managed_node2 27712 1727096488.66076: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27712 1727096488.66080: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096488.66090: getting variables 27712 1727096488.66091: in VariableManager get_vars() 27712 1727096488.66146: Calling all_inventory to load vars for managed_node2 27712 1727096488.66149: Calling groups_inventory to load vars for managed_node2 27712 1727096488.66151: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096488.66161: Calling all_plugins_play to load vars for managed_node2 27712 1727096488.66163: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096488.66166: Calling groups_plugins_play to load vars for managed_node2 27712 1727096488.66205: done sending task result for task 0afff68d-5257-cbc7-8716-0000000004a0 27712 1727096488.66215: WORKER PROCESS EXITING 27712 1727096488.67188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096488.68095: done with get_vars() 27712 1727096488.68111: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:01:28 -0400 (0:00:01.947) 0:00:14.376 ****** 27712 1727096488.68312: entering _queue_task() for managed_node2/package_facts 27712 1727096488.68314: Creating lock for package_facts 27712 1727096488.68827: worker is 1 (out of 1 available) 27712 1727096488.68837: exiting _queue_task() for managed_node2/package_facts 27712 1727096488.68849: done queuing things up, now waiting for results queue to drain 27712 1727096488.68850: waiting for pending results... 27712 1727096488.69014: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 27712 1727096488.69183: in run() - task 0afff68d-5257-cbc7-8716-0000000004a1 27712 1727096488.69205: variable 'ansible_search_path' from source: unknown 27712 1727096488.69212: variable 'ansible_search_path' from source: unknown 27712 1727096488.69258: calling self._execute() 27712 1727096488.69353: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096488.69403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096488.69407: variable 'omit' from source: magic vars 27712 1727096488.69858: variable 'ansible_distribution_major_version' from source: facts 27712 1727096488.69886: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096488.69954: variable 'omit' from source: magic vars 27712 1727096488.70010: variable 'omit' from source: magic vars 27712 1727096488.70054: variable 'omit' from source: magic vars 27712 1727096488.70105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096488.70162: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096488.70192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096488.70231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096488.70234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096488.70273: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096488.70341: variable 'ansible_host' from source: host vars for 
'managed_node2' 27712 1727096488.70344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096488.70405: Set connection var ansible_connection to ssh 27712 1727096488.70437: Set connection var ansible_pipelining to False 27712 1727096488.70500: Set connection var ansible_timeout to 10 27712 1727096488.70599: Set connection var ansible_shell_type to sh 27712 1727096488.70603: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096488.70605: Set connection var ansible_shell_executable to /bin/sh 27712 1727096488.70607: variable 'ansible_shell_executable' from source: unknown 27712 1727096488.70610: variable 'ansible_connection' from source: unknown 27712 1727096488.70612: variable 'ansible_module_compression' from source: unknown 27712 1727096488.70614: variable 'ansible_shell_type' from source: unknown 27712 1727096488.70616: variable 'ansible_shell_executable' from source: unknown 27712 1727096488.70617: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096488.70619: variable 'ansible_pipelining' from source: unknown 27712 1727096488.70621: variable 'ansible_timeout' from source: unknown 27712 1727096488.70680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096488.71049: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096488.71078: variable 'omit' from source: magic vars 27712 1727096488.71091: starting attempt loop 27712 1727096488.71145: running the handler 27712 1727096488.71149: _low_level_execute_command(): starting 27712 1727096488.71151: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096488.72011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096488.72301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096488.72596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096488.72754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096488.74425: stdout chunk (state=3): >>>/root <<< 27712 1727096488.74652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096488.74663: stdout chunk (state=3): >>><<< 27712 1727096488.74681: stderr chunk (state=3): >>><<< 27712 1727096488.74869: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096488.74873: _low_level_execute_command(): starting 27712 1727096488.74876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539 `" && echo ansible-tmp-1727096488.7478113-28490-7814733580539="` echo /root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539 `" ) && sleep 0' 27712 1727096488.76036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096488.76048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096488.76065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096488.76130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096488.76272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096488.76308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096488.78203: stdout chunk (state=3): >>>ansible-tmp-1727096488.7478113-28490-7814733580539=/root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539 <<< 27712 1727096488.78304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096488.78345: stderr chunk (state=3): >>><<< 27712 1727096488.78348: stdout chunk (state=3): >>><<< 27712 1727096488.78576: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096488.7478113-28490-7814733580539=/root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096488.78581: variable 'ansible_module_compression' from source: unknown 27712 1727096488.78584: ANSIBALLZ: Using lock for package_facts 27712 1727096488.78587: ANSIBALLZ: Acquiring lock 27712 1727096488.78590: ANSIBALLZ: Lock acquired: 140297907433456 27712 1727096488.78592: ANSIBALLZ: Creating module 27712 1727096489.08292: ANSIBALLZ: Writing module into payload 27712 1727096489.08384: ANSIBALLZ: Writing module 27712 1727096489.08405: ANSIBALLZ: Renaming module 27712 1727096489.08411: ANSIBALLZ: Done creating module 27712 1727096489.08430: variable 'ansible_facts' from source: unknown 27712 1727096489.08551: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/AnsiballZ_package_facts.py 27712 1727096489.08659: Sending initial data 27712 1727096489.08662: Sent initial data (160 bytes) 27712 1727096489.09124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096489.09128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096489.09131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096489.09133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096489.09173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096489.09191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
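The ANSIBALLZ lines above show the controller building a self-contained AnsiballZ_package_facts.py payload, creating a remote temp directory under a restrictive umask, and pushing the file over the existing SSH control connection before executing it with the remote python3.12. As a rough standalone analogy only (not Ansible's actual wrapper, and every file name here is hypothetical), the standard-library zipapp module can bundle a package directory into one archive that a remote interpreter runs directly:

# Rough analogy to the AnsiballZ payload step -- illustrative only.
# Ansible's real wrapper templates a stub, embeds module_utils, and compresses
# the payload; this just shows the "one self-contained file to ship and run" idea.
import pathlib
import zipapp

src = pathlib.Path("my_module_pkg")          # hypothetical package directory
src.mkdir(exist_ok=True)
(src / "__main__.py").write_text('print("hello from a zipped payload")\n')

target = pathlib.Path("AnsiballZ_demo.pyz")  # hypothetical output name
zipapp.create_archive(src, target, interpreter="/usr/bin/env python3")
print(f"built {target}; execute it remotely with: python3 {target}")

The shipped archive only needs a Python interpreter on the target, which matches the put / chmod u+x / "python3.12 AnsiballZ_package_facts.py" sequence recorded in the chunks that follow.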
27712 1727096489.09196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096489.09238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096489.10874: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096489.10902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096489.10932: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpg12hdigk /root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/AnsiballZ_package_facts.py <<< 27712 1727096489.10946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/AnsiballZ_package_facts.py" <<< 27712 1727096489.10964: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpg12hdigk" to remote "/root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/AnsiballZ_package_facts.py" <<< 27712 1727096489.10970: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/AnsiballZ_package_facts.py" <<< 27712 1727096489.11947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096489.11994: stderr chunk (state=3): >>><<< 27712 1727096489.11998: stdout chunk (state=3): >>><<< 27712 1727096489.12017: done transferring module to remote 27712 1727096489.12026: _low_level_execute_command(): starting 27712 1727096489.12032: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/ /root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/AnsiballZ_package_facts.py && sleep 0' 27712 1727096489.12459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096489.12464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096489.12496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096489.12499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096489.12502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096489.12504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096489.12559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096489.12565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096489.12577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096489.12597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096489.14414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096489.14439: stderr chunk (state=3): >>><<< 27712 1727096489.14443: stdout chunk (state=3): >>><<< 27712 1727096489.14458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096489.14461: _low_level_execute_command(): starting 27712 1727096489.14466: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/AnsiballZ_package_facts.py && sleep 0' 27712 1727096489.14917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096489.14921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096489.14923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096489.14926: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096489.14976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096489.14980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096489.14983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096489.15031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096489.59503: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 27712 1727096489.59609: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 27712 1727096489.59685: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", 
"version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 27712 1727096489.59716: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": 
"1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": 
"1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": 
"nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27712 1727096489.61450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096489.61536: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096489.61540: stdout chunk (state=3): >>><<< 27712 1727096489.61542: stderr chunk (state=3): >>><<< 27712 1727096489.61782: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096489.63780: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096489.63807: _low_level_execute_command(): starting 27712 1727096489.63817: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096488.7478113-28490-7814733580539/ > /dev/null 2>&1 && sleep 0' 27712 1727096489.64496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096489.64510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096489.64585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096489.64645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096489.64664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096489.64695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096489.64757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096489.66631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096489.66638: stdout chunk (state=3): >>><<< 27712 1727096489.66646: stderr chunk (state=3): >>><<< 27712 1727096489.66673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096489.66676: handler run complete 27712 1727096489.67166: variable 'ansible_facts' from source: unknown 27712 1727096489.67405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096489.68889: variable 'ansible_facts' from source: unknown 27712 1727096489.69118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096489.69500: attempt loop complete, returning result 27712 1727096489.69510: _execute() done 27712 1727096489.69513: dumping result to json 27712 1727096489.69629: done dumping result, returning 27712 1727096489.69639: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-cbc7-8716-0000000004a1] 27712 1727096489.69641: sending task result for task 0afff68d-5257-cbc7-8716-0000000004a1 27712 1727096489.71275: done sending task result for task 0afff68d-5257-cbc7-8716-0000000004a1 27712 1727096489.71279: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096489.71380: no more pending results, returning what we have 27712 1727096489.71382: results queue empty 27712 1727096489.71383: checking for any_errors_fatal 27712 1727096489.71386: done checking for any_errors_fatal 27712 1727096489.71387: checking for max_fail_percentage 27712 1727096489.71389: done checking for max_fail_percentage 27712 1727096489.71389: checking to see if all hosts have failed and the running result is not ok 27712 1727096489.71390: done checking to see if all hosts have failed 27712 1727096489.71391: getting the remaining hosts for this loop 27712 1727096489.71392: done getting the remaining hosts for this loop 27712 1727096489.71395: getting the next task for host managed_node2 27712 1727096489.71402: done getting next task for host managed_node2 27712 1727096489.71405: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 27712 1727096489.71408: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096489.71417: getting variables 27712 1727096489.71418: in VariableManager get_vars() 27712 1727096489.71451: Calling all_inventory to load vars for managed_node2 27712 1727096489.71454: Calling groups_inventory to load vars for managed_node2 27712 1727096489.71456: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096489.71464: Calling all_plugins_play to load vars for managed_node2 27712 1727096489.71468: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096489.71472: Calling groups_plugins_play to load vars for managed_node2 27712 1727096489.72615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096489.74149: done with get_vars() 27712 1727096489.74173: done getting variables 27712 1727096489.74230: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:01:29 -0400 (0:00:01.059) 0:00:15.435 ****** 27712 1727096489.74268: entering _queue_task() for managed_node2/debug 27712 1727096489.74700: worker is 1 (out of 1 available) 27712 1727096489.74710: exiting _queue_task() for managed_node2/debug 27712 1727096489.74720: done queuing things up, now waiting for results queue to drain 27712 1727096489.74722: waiting for pending results... 27712 1727096489.74956: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 27712 1727096489.75006: in run() - task 0afff68d-5257-cbc7-8716-00000000001c 27712 1727096489.75029: variable 'ansible_search_path' from source: unknown 27712 1727096489.75037: variable 'ansible_search_path' from source: unknown 27712 1727096489.75085: calling self._execute() 27712 1727096489.75176: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096489.75188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096489.75202: variable 'omit' from source: magic vars 27712 1727096489.75571: variable 'ansible_distribution_major_version' from source: facts 27712 1727096489.75593: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096489.75605: variable 'omit' from source: magic vars 27712 1727096489.75659: variable 'omit' from source: magic vars 27712 1727096489.75762: variable 'network_provider' from source: set_fact 27712 1727096489.75787: variable 'omit' from source: magic vars 27712 1727096489.75917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096489.75922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096489.75925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096489.75928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096489.75940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 
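[annotation] For orientation: the large JSON blob above and the censored "ok" result belong to the "Check which packages are installed" task, which runs the package_facts module with the module_args shown in the trace (manager: ['auto'], strategy: 'first') under no_log. A minimal sketch of an equivalent task, plus one way to read the gathered data back, follows; the lookup of openssl is purely illustrative, it is simply one of the packages present in the dump above.

    - name: Check which packages are installed (sketch)
      ansible.builtin.package_facts:
        manager: auto      # matches the logged module_args
        strategy: first    # stop at the first package manager that works
      no_log: true         # why the task result above prints as "censored"

    - name: Use the gathered facts
      ansible.builtin.debug:
        msg: "openssl {{ ansible_facts.packages['openssl'][0].version }} is installed"
      when: "'openssl' in ansible_facts.packages"

The facts land under ansible_facts.packages as a dict keyed by package name, each value a list of dicts with name/version/release/epoch/arch/source, exactly the structure visible in the JSON above.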
1727096489.75974: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096489.75983: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096489.75990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096489.76092: Set connection var ansible_connection to ssh 27712 1727096489.76105: Set connection var ansible_pipelining to False 27712 1727096489.76114: Set connection var ansible_timeout to 10 27712 1727096489.76120: Set connection var ansible_shell_type to sh 27712 1727096489.76136: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096489.76145: Set connection var ansible_shell_executable to /bin/sh 27712 1727096489.76173: variable 'ansible_shell_executable' from source: unknown 27712 1727096489.76182: variable 'ansible_connection' from source: unknown 27712 1727096489.76189: variable 'ansible_module_compression' from source: unknown 27712 1727096489.76195: variable 'ansible_shell_type' from source: unknown 27712 1727096489.76201: variable 'ansible_shell_executable' from source: unknown 27712 1727096489.76207: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096489.76243: variable 'ansible_pipelining' from source: unknown 27712 1727096489.76247: variable 'ansible_timeout' from source: unknown 27712 1727096489.76249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096489.76375: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096489.76393: variable 'omit' from source: magic vars 27712 1727096489.76403: starting attempt loop 27712 1727096489.76461: running the handler 27712 1727096489.76465: handler run complete 27712 1727096489.76480: attempt loop complete, returning result 27712 1727096489.76487: _execute() done 27712 1727096489.76494: dumping result to json 27712 1727096489.76501: done dumping result, returning 27712 1727096489.76512: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-cbc7-8716-00000000001c] 27712 1727096489.76520: sending task result for task 0afff68d-5257-cbc7-8716-00000000001c 27712 1727096489.76723: done sending task result for task 0afff68d-5257-cbc7-8716-00000000001c 27712 1727096489.76726: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 27712 1727096489.76785: no more pending results, returning what we have 27712 1727096489.76789: results queue empty 27712 1727096489.76790: checking for any_errors_fatal 27712 1727096489.76800: done checking for any_errors_fatal 27712 1727096489.76801: checking for max_fail_percentage 27712 1727096489.76803: done checking for max_fail_percentage 27712 1727096489.76804: checking to see if all hosts have failed and the running result is not ok 27712 1727096489.76805: done checking to see if all hosts have failed 27712 1727096489.76805: getting the remaining hosts for this loop 27712 1727096489.76807: done getting the remaining hosts for this loop 27712 1727096489.76811: getting the next task for host managed_node2 27712 1727096489.76817: done getting next task for host managed_node2 27712 1727096489.76821: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 27712 1727096489.76825: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096489.76835: getting variables 27712 1727096489.76837: in VariableManager get_vars() 27712 1727096489.76879: Calling all_inventory to load vars for managed_node2 27712 1727096489.76882: Calling groups_inventory to load vars for managed_node2 27712 1727096489.76884: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096489.76894: Calling all_plugins_play to load vars for managed_node2 27712 1727096489.76897: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096489.76900: Calling groups_plugins_play to load vars for managed_node2 27712 1727096489.79689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096489.82759: done with get_vars() 27712 1727096489.82991: done getting variables 27712 1727096489.83050: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:01:29 -0400 (0:00:00.088) 0:00:15.524 ****** 27712 1727096489.83088: entering _queue_task() for managed_node2/fail 27712 1727096489.83598: worker is 1 (out of 1 available) 27712 1727096489.83612: exiting _queue_task() for managed_node2/fail 27712 1727096489.83625: done queuing things up, now waiting for results queue to drain 27712 1727096489.83626: waiting for pending results... 
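[annotation] The "ok" result just above ("Using network provider: nm") comes from a plain debug task over the network_provider variable, which the trace shows was set earlier via set_fact. A sketch of what such a task looks like; the exact wording in the role's tasks/main.yml is not reproduced in this log, so treat this as an approximation.

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"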
27712 1727096489.84190: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27712 1727096489.84289: in run() - task 0afff68d-5257-cbc7-8716-00000000001d 27712 1727096489.84310: variable 'ansible_search_path' from source: unknown 27712 1727096489.84320: variable 'ansible_search_path' from source: unknown 27712 1727096489.84363: calling self._execute() 27712 1727096489.84457: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096489.84472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096489.84487: variable 'omit' from source: magic vars 27712 1727096489.84841: variable 'ansible_distribution_major_version' from source: facts 27712 1727096489.84858: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096489.85173: variable 'network_state' from source: role '' defaults 27712 1727096489.85176: Evaluated conditional (network_state != {}): False 27712 1727096489.85180: when evaluation is False, skipping this task 27712 1727096489.85183: _execute() done 27712 1727096489.85185: dumping result to json 27712 1727096489.85187: done dumping result, returning 27712 1727096489.85190: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-cbc7-8716-00000000001d] 27712 1727096489.85192: sending task result for task 0afff68d-5257-cbc7-8716-00000000001d 27712 1727096489.85264: done sending task result for task 0afff68d-5257-cbc7-8716-00000000001d 27712 1727096489.85270: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096489.85316: no more pending results, returning what we have 27712 1727096489.85320: results queue empty 27712 1727096489.85321: checking for any_errors_fatal 27712 1727096489.85327: done checking for any_errors_fatal 27712 1727096489.85328: checking for max_fail_percentage 27712 1727096489.85330: done checking for max_fail_percentage 27712 1727096489.85331: checking to see if all hosts have failed and the running result is not ok 27712 1727096489.85331: done checking to see if all hosts have failed 27712 1727096489.85332: getting the remaining hosts for this loop 27712 1727096489.85333: done getting the remaining hosts for this loop 27712 1727096489.85337: getting the next task for host managed_node2 27712 1727096489.85344: done getting next task for host managed_node2 27712 1727096489.85347: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27712 1727096489.85350: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096489.85366: getting variables 27712 1727096489.85369: in VariableManager get_vars() 27712 1727096489.85411: Calling all_inventory to load vars for managed_node2 27712 1727096489.85415: Calling groups_inventory to load vars for managed_node2 27712 1727096489.85417: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096489.85429: Calling all_plugins_play to load vars for managed_node2 27712 1727096489.85432: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096489.85435: Calling groups_plugins_play to load vars for managed_node2 27712 1727096489.86806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096489.88328: done with get_vars() 27712 1727096489.88351: done getting variables 27712 1727096489.88409: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:01:29 -0400 (0:00:00.053) 0:00:15.577 ****** 27712 1727096489.88442: entering _queue_task() for managed_node2/fail 27712 1727096489.88936: worker is 1 (out of 1 available) 27712 1727096489.88948: exiting _queue_task() for managed_node2/fail 27712 1727096489.88959: done queuing things up, now waiting for results queue to drain 27712 1727096489.88961: waiting for pending results... 
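[annotation] Both "Abort applying the network state configuration ..." tasks above are skipped because network_state comes from the role defaults and is the empty mapping, so network_state != {} is False; the trace then records the skip together with that expression as false_condition. The shape of such a guarded fail task, as a sketch (the message text is an assumption; the condition is the one shown in the skip result):

    - name: Abort if the network_state variable is used with the initscripts provider (sketch)
      ansible.builtin.fail:
        msg: "network_state is not supported with the initscripts provider"   # assumed wording
      when: network_state != {}   # False in this run, hence "skipping" with this false_condition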
27712 1727096489.89491: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27712 1727096489.89725: in run() - task 0afff68d-5257-cbc7-8716-00000000001e 27712 1727096489.89729: variable 'ansible_search_path' from source: unknown 27712 1727096489.89732: variable 'ansible_search_path' from source: unknown 27712 1727096489.89741: calling self._execute() 27712 1727096489.89843: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096489.89856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096489.89874: variable 'omit' from source: magic vars 27712 1727096489.90255: variable 'ansible_distribution_major_version' from source: facts 27712 1727096489.90283: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096489.90475: variable 'network_state' from source: role '' defaults 27712 1727096489.90485: Evaluated conditional (network_state != {}): False 27712 1727096489.90489: when evaluation is False, skipping this task 27712 1727096489.90491: _execute() done 27712 1727096489.90494: dumping result to json 27712 1727096489.90497: done dumping result, returning 27712 1727096489.90499: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-cbc7-8716-00000000001e] 27712 1727096489.90502: sending task result for task 0afff68d-5257-cbc7-8716-00000000001e skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096489.90624: no more pending results, returning what we have 27712 1727096489.90627: results queue empty 27712 1727096489.90628: checking for any_errors_fatal 27712 1727096489.90638: done checking for any_errors_fatal 27712 1727096489.90639: checking for max_fail_percentage 27712 1727096489.90641: done checking for max_fail_percentage 27712 1727096489.90642: checking to see if all hosts have failed and the running result is not ok 27712 1727096489.90643: done checking to see if all hosts have failed 27712 1727096489.90644: getting the remaining hosts for this loop 27712 1727096489.90645: done getting the remaining hosts for this loop 27712 1727096489.90650: getting the next task for host managed_node2 27712 1727096489.90657: done getting next task for host managed_node2 27712 1727096489.90661: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27712 1727096489.90665: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096489.90684: getting variables 27712 1727096489.90686: in VariableManager get_vars() 27712 1727096489.90845: Calling all_inventory to load vars for managed_node2 27712 1727096489.90849: Calling groups_inventory to load vars for managed_node2 27712 1727096489.90851: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096489.90863: Calling all_plugins_play to load vars for managed_node2 27712 1727096489.90867: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096489.90980: Calling groups_plugins_play to load vars for managed_node2 27712 1727096489.91544: done sending task result for task 0afff68d-5257-cbc7-8716-00000000001e 27712 1727096489.91548: WORKER PROCESS EXITING 27712 1727096489.94302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096489.96885: done with get_vars() 27712 1727096489.96915: done getting variables 27712 1727096489.96980: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:01:29 -0400 (0:00:00.085) 0:00:15.663 ****** 27712 1727096489.97021: entering _queue_task() for managed_node2/fail 27712 1727096489.97465: worker is 1 (out of 1 available) 27712 1727096489.97487: exiting _queue_task() for managed_node2/fail 27712 1727096489.97645: done queuing things up, now waiting for results queue to drain 27712 1727096489.97647: waiting for pending results... 
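[annotation] A pattern worth noting in this trace: every role task so far logs "Evaluated conditional (ansible_distribution_major_version != '6'): True" before anything else, i.e. the whole task list is guarded against EL6 hosts, most likely via a when on an enclosing block or include rather than on each task (the log alone cannot distinguish the two). A minimal sketch of that guard around a placeholder task:

    - name: Placeholder role task with the EL6 guard seen throughout this trace
      ansible.builtin.debug:
        msg: "distribution major version is {{ ansible_distribution_major_version }}"
      when: ansible_distribution_major_version != '6'   # evaluated (and True) for every task above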
27712 1727096489.97752: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27712 1727096489.97908: in run() - task 0afff68d-5257-cbc7-8716-00000000001f 27712 1727096489.97926: variable 'ansible_search_path' from source: unknown 27712 1727096489.97934: variable 'ansible_search_path' from source: unknown 27712 1727096489.97983: calling self._execute() 27712 1727096489.98075: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096489.98096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096489.98205: variable 'omit' from source: magic vars 27712 1727096489.98494: variable 'ansible_distribution_major_version' from source: facts 27712 1727096489.98509: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096489.98674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096490.01896: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096490.01976: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096490.02017: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096490.02055: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096490.02088: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096490.02162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.02199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.02248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.02292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.02307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.02456: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.02508: Evaluated conditional (ansible_distribution_major_version | int > 9): True 27712 1727096490.02673: variable 'ansible_distribution' from source: facts 27712 1727096490.02677: variable '__network_rh_distros' from source: role '' defaults 27712 1727096490.02722: Evaluated conditional (ansible_distribution in __network_rh_distros): True 27712 1727096490.02936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.02962: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.02994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.03073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.03077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.03095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.03120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.03147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.03191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.03213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.03259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.03373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.03377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.03379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.03382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.03674: variable 'network_connections' from source: task vars 27712 1727096490.03690: variable 'interface0' from source: play vars 27712 1727096490.03760: variable 'interface0' from source: play vars 27712 1727096490.03777: variable 'interface0' from source: play vars 27712 1727096490.03840: variable 'interface0' from source: play vars 27712 1727096490.03859: variable 'interface1' from source: play vars 27712 
1727096490.03922: variable 'interface1' from source: play vars 27712 1727096490.03934: variable 'interface1' from source: play vars 27712 1727096490.03995: variable 'interface1' from source: play vars 27712 1727096490.04014: variable 'network_state' from source: role '' defaults 27712 1727096490.04083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096490.04248: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096490.04351: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096490.04355: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096490.04574: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096490.04578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096490.04580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096490.04583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.04585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096490.04691: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 27712 1727096490.04701: when evaluation is False, skipping this task 27712 1727096490.04709: _execute() done 27712 1727096490.04722: dumping result to json 27712 1727096490.04741: done dumping result, returning 27712 1727096490.04754: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-cbc7-8716-00000000001f] 27712 1727096490.04764: sending task result for task 0afff68d-5257-cbc7-8716-00000000001f 27712 1727096490.04973: done sending task result for task 0afff68d-5257-cbc7-8716-00000000001f 27712 1727096490.04977: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 27712 1727096490.05025: no more pending results, returning what we have 27712 1727096490.05029: results queue empty 27712 1727096490.05030: checking for any_errors_fatal 27712 1727096490.05036: done checking for any_errors_fatal 27712 1727096490.05037: checking for max_fail_percentage 27712 1727096490.05039: done checking for max_fail_percentage 27712 1727096490.05040: checking to see if all hosts have 
failed and the running result is not ok 27712 1727096490.05040: done checking to see if all hosts have failed 27712 1727096490.05041: getting the remaining hosts for this loop 27712 1727096490.05043: done getting the remaining hosts for this loop 27712 1727096490.05048: getting the next task for host managed_node2 27712 1727096490.05055: done getting next task for host managed_node2 27712 1727096490.05059: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27712 1727096490.05062: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096490.05078: getting variables 27712 1727096490.05080: in VariableManager get_vars() 27712 1727096490.05122: Calling all_inventory to load vars for managed_node2 27712 1727096490.05125: Calling groups_inventory to load vars for managed_node2 27712 1727096490.05128: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096490.05139: Calling all_plugins_play to load vars for managed_node2 27712 1727096490.05142: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096490.05145: Calling groups_plugins_play to load vars for managed_node2 27712 1727096490.06807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096490.08377: done with get_vars() 27712 1727096490.08398: done getting variables 27712 1727096490.08493: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:01:30 -0400 (0:00:00.115) 0:00:15.778 ****** 27712 1727096490.08523: entering _queue_task() for managed_node2/dnf 27712 1727096490.08816: worker is 1 (out of 1 available) 27712 1727096490.08829: exiting _queue_task() for managed_node2/dnf 27712 1727096490.08840: done queuing things up, now waiting for results queue to drain 27712 1727096490.08841: waiting for pending results... 
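[annotation] The false_condition reported for the teaming-abort task above is a pure Jinja2 filter chain: keep only connections that define a type, keep those whose type matches ^team$, and test whether anything is left, applied both to network_connections and to network_state.get("interfaces", []). A sketch of that task as it might appear in a role; the fail message is an assumption, while the when expressions are copied from the conditions evaluated in the log.

    - name: Abort applying teaming configuration if the system is EL10 or later (sketch)
      ansible.builtin.fail:
        msg: "Teaming is not supported on this release"   # assumed wording
      when:
        - ansible_distribution_major_version | int > 9    # True in this run
        - ansible_distribution in __network_rh_distros    # True in this run
        - >-
          network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", [])
          | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0

In this run the two version checks are True and the filter chain is False, so the when list as a whole fails and Ansible records the filter expression as the false_condition, exactly as printed in the skipping result above.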
27712 1727096490.09114: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27712 1727096490.09275: in run() - task 0afff68d-5257-cbc7-8716-000000000020 27712 1727096490.09279: variable 'ansible_search_path' from source: unknown 27712 1727096490.09281: variable 'ansible_search_path' from source: unknown 27712 1727096490.09316: calling self._execute() 27712 1727096490.09473: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.09477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.09480: variable 'omit' from source: magic vars 27712 1727096490.09786: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.09803: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096490.10001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096490.16388: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096490.16451: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096490.16496: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096490.16532: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096490.16575: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096490.16873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.16876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.16879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.16882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.16884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.16886: variable 'ansible_distribution' from source: facts 27712 1727096490.16889: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.16891: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27712 1727096490.17013: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096490.17125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.17149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.17176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.17220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.17272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.17288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.17314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.17342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.17381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.17444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.17448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.17466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.17495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.17533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.17553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.17701: variable 'network_connections' from source: task vars 27712 1727096490.17715: variable 'interface0' from source: play vars 27712 1727096490.17785: variable 'interface0' from source: play vars 27712 1727096490.17872: variable 'interface0' from source: play vars 27712 1727096490.17881: variable 'interface0' from source: play vars 27712 1727096490.17883: variable 'interface1' from source: play vars 27712 
1727096490.17940: variable 'interface1' from source: play vars 27712 1727096490.17952: variable 'interface1' from source: play vars 27712 1727096490.18018: variable 'interface1' from source: play vars 27712 1727096490.18090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096490.18254: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096490.18296: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096490.18334: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096490.18369: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096490.18437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096490.18462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096490.18494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.18642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096490.18645: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096490.18816: variable 'network_connections' from source: task vars 27712 1727096490.18826: variable 'interface0' from source: play vars 27712 1727096490.18893: variable 'interface0' from source: play vars 27712 1727096490.18904: variable 'interface0' from source: play vars 27712 1727096490.18970: variable 'interface0' from source: play vars 27712 1727096490.18987: variable 'interface1' from source: play vars 27712 1727096490.19047: variable 'interface1' from source: play vars 27712 1727096490.19059: variable 'interface1' from source: play vars 27712 1727096490.19123: variable 'interface1' from source: play vars 27712 1727096490.19162: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096490.19171: when evaluation is False, skipping this task 27712 1727096490.19182: _execute() done 27712 1727096490.19188: dumping result to json 27712 1727096490.19195: done dumping result, returning 27712 1727096490.19205: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000020] 27712 1727096490.19212: sending task result for task 0afff68d-5257-cbc7-8716-000000000020 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096490.19440: no more pending results, returning what we have 27712 1727096490.19443: results queue empty 27712 1727096490.19444: checking for any_errors_fatal 27712 1727096490.19451: done checking for any_errors_fatal 
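For readers tracing why this task was skipped: the log shows the dnf action plugin being loaded and the guard __network_wireless_connections_defined or __network_team_connections_defined evaluating to False. The role's real task at roles/network/tasks/main.yml:36 is not reproduced here; the following is only a minimal illustrative sketch of a task shaped the way the log implies, where the module name and the when expression are taken from the log and the package variable, check_mode usage, and register name are assumptions.

  - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
    ansible.builtin.dnf:
      name: "{{ network_packages }}"       # assumed: the role-default package list seen later in this log
      state: latest
    check_mode: true                       # assumed: report available updates without installing them
    register: __network_dnf_update_check   # hypothetical name, for illustration only
    when: __network_wireless_connections_defined or __network_team_connections_defined

With both of those role defaults evaluating to False for this play, Ansible never invokes the module and records skip_reason "Conditional result was False", exactly as the JSON above shows.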
27712 1727096490.19452: checking for max_fail_percentage 27712 1727096490.19454: done checking for max_fail_percentage 27712 1727096490.19455: checking to see if all hosts have failed and the running result is not ok 27712 1727096490.19456: done checking to see if all hosts have failed 27712 1727096490.19456: getting the remaining hosts for this loop 27712 1727096490.19458: done getting the remaining hosts for this loop 27712 1727096490.19461: getting the next task for host managed_node2 27712 1727096490.19469: done getting next task for host managed_node2 27712 1727096490.19473: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27712 1727096490.19476: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096490.19490: getting variables 27712 1727096490.19491: in VariableManager get_vars() 27712 1727096490.19530: Calling all_inventory to load vars for managed_node2 27712 1727096490.19532: Calling groups_inventory to load vars for managed_node2 27712 1727096490.19534: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096490.19544: Calling all_plugins_play to load vars for managed_node2 27712 1727096490.19546: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096490.19549: Calling groups_plugins_play to load vars for managed_node2 27712 1727096490.20081: done sending task result for task 0afff68d-5257-cbc7-8716-000000000020 27712 1727096490.20084: WORKER PROCESS EXITING 27712 1727096490.24571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096490.26010: done with get_vars() 27712 1727096490.26030: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27712 1727096490.26093: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:01:30 -0400 (0:00:00.175) 0:00:15.954 ****** 27712 1727096490.26119: entering _queue_task() for managed_node2/yum 27712 1727096490.26121: Creating lock for yum 27712 1727096490.26454: worker is 1 (out of 1 available) 27712 1727096490.26466: exiting _queue_task() for managed_node2/yum 27712 1727096490.26583: done queuing things up, now waiting for results queue to drain 27712 1727096490.26585: waiting for pending results... 
27712 1727096490.26772: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27712 1727096490.26899: in run() - task 0afff68d-5257-cbc7-8716-000000000021 27712 1727096490.26923: variable 'ansible_search_path' from source: unknown 27712 1727096490.26931: variable 'ansible_search_path' from source: unknown 27712 1727096490.26969: calling self._execute() 27712 1727096490.27055: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.27066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.27082: variable 'omit' from source: magic vars 27712 1727096490.27673: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.27676: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096490.27679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096490.29886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096490.29970: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096490.30013: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096490.30051: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096490.30088: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096490.30164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.30204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.30237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.30290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.30312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.30412: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.30433: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27712 1727096490.30441: when evaluation is False, skipping this task 27712 1727096490.30449: _execute() done 27712 1727096490.30456: dumping result to json 27712 1727096490.30463: done dumping result, returning 27712 1727096490.30478: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000021] 27712 
1727096490.30486: sending task result for task 0afff68d-5257-cbc7-8716-000000000021 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 27712 1727096490.30655: no more pending results, returning what we have 27712 1727096490.30658: results queue empty 27712 1727096490.30659: checking for any_errors_fatal 27712 1727096490.30669: done checking for any_errors_fatal 27712 1727096490.30669: checking for max_fail_percentage 27712 1727096490.30671: done checking for max_fail_percentage 27712 1727096490.30672: checking to see if all hosts have failed and the running result is not ok 27712 1727096490.30673: done checking to see if all hosts have failed 27712 1727096490.30674: getting the remaining hosts for this loop 27712 1727096490.30675: done getting the remaining hosts for this loop 27712 1727096490.30678: getting the next task for host managed_node2 27712 1727096490.30685: done getting next task for host managed_node2 27712 1727096490.30688: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27712 1727096490.30691: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096490.30705: getting variables 27712 1727096490.30707: in VariableManager get_vars() 27712 1727096490.30755: Calling all_inventory to load vars for managed_node2 27712 1727096490.30758: Calling groups_inventory to load vars for managed_node2 27712 1727096490.30760: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096490.30877: Calling all_plugins_play to load vars for managed_node2 27712 1727096490.30882: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096490.30886: Calling groups_plugins_play to load vars for managed_node2 27712 1727096490.31481: done sending task result for task 0afff68d-5257-cbc7-8716-000000000021 27712 1727096490.31484: WORKER PROCESS EXITING 27712 1727096490.32390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096490.34027: done with get_vars() 27712 1727096490.34047: done getting variables 27712 1727096490.34105: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:01:30 -0400 (0:00:00.080) 0:00:16.034 ****** 27712 1727096490.34137: entering _queue_task() for managed_node2/fail 27712 1727096490.34427: worker is 1 (out of 1 available) 27712 1727096490.34440: exiting _queue_task() for managed_node2/fail 27712 1727096490.34452: done queuing things up, now waiting for results queue to drain 27712 1727096490.34453: waiting for pending results... 
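The YUM variant just above is skipped for a different reason: its guard ansible_distribution_major_version | int < 8 is False on this host, so the legacy path never runs (note also the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" record, since yum is only an alias on dnf-based systems). As a hedged sketch only, not the role's source, such a version-gated task could look roughly like:

  - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
    ansible.builtin.yum:                 # redirected to ansible.builtin.dnf on modern hosts, as the log notes
      name: "{{ network_packages }}"     # assumed package list variable
      state: latest
    check_mode: true                     # assumed
    when: ansible_distribution_major_version | int < 8

The DNF and YUM checks are gated on complementary version tests (> 7 versus < 8 in the evaluations logged above), so at most one of the pair can run on a given host, which is why they appear here as back-to-back skips.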
27712 1727096490.34724: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27712 1727096490.34856: in run() - task 0afff68d-5257-cbc7-8716-000000000022 27712 1727096490.34878: variable 'ansible_search_path' from source: unknown 27712 1727096490.34888: variable 'ansible_search_path' from source: unknown 27712 1727096490.34931: calling self._execute() 27712 1727096490.35027: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.35042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.35058: variable 'omit' from source: magic vars 27712 1727096490.35431: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.35453: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096490.35577: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096490.35775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096490.37950: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096490.38027: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096490.38073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096490.38272: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096490.38276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096490.38279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.38281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.38284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.38322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.38341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.38395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.38472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.38476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.38496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.38519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.38564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.38594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.38627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.38671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.38691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.38943: variable 'network_connections' from source: task vars 27712 1727096490.38946: variable 'interface0' from source: play vars 27712 1727096490.38954: variable 'interface0' from source: play vars 27712 1727096490.38969: variable 'interface0' from source: play vars 27712 1727096490.39030: variable 'interface0' from source: play vars 27712 1727096490.39053: variable 'interface1' from source: play vars 27712 1727096490.39115: variable 'interface1' from source: play vars 27712 1727096490.39127: variable 'interface1' from source: play vars 27712 1727096490.39194: variable 'interface1' from source: play vars 27712 1727096490.39271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096490.39448: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096490.39494: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096490.39529: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096490.39560: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096490.39611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096490.39637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096490.39697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.39701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096490.39760: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096490.40006: variable 'network_connections' from source: task vars 27712 1727096490.40022: variable 'interface0' from source: play vars 27712 1727096490.40085: variable 'interface0' from source: play vars 27712 1727096490.40131: variable 'interface0' from source: play vars 27712 1727096490.40161: variable 'interface0' from source: play vars 27712 1727096490.40181: variable 'interface1' from source: play vars 27712 1727096490.40245: variable 'interface1' from source: play vars 27712 1727096490.40348: variable 'interface1' from source: play vars 27712 1727096490.40351: variable 'interface1' from source: play vars 27712 1727096490.40369: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096490.40378: when evaluation is False, skipping this task 27712 1727096490.40385: _execute() done 27712 1727096490.40391: dumping result to json 27712 1727096490.40398: done dumping result, returning 27712 1727096490.40408: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000022] 27712 1727096490.40416: sending task result for task 0afff68d-5257-cbc7-8716-000000000022 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096490.40617: no more pending results, returning what we have 27712 1727096490.40621: results queue empty 27712 1727096490.40622: checking for any_errors_fatal 27712 1727096490.40631: done checking for any_errors_fatal 27712 1727096490.40632: checking for max_fail_percentage 27712 1727096490.40634: done checking for max_fail_percentage 27712 1727096490.40635: checking to see if all hosts have failed and the running result is not ok 27712 1727096490.40636: done checking to see if all hosts have failed 27712 1727096490.40636: getting the remaining hosts for this loop 27712 1727096490.40637: done getting the remaining hosts for this loop 27712 1727096490.40641: getting the next task for host managed_node2 27712 1727096490.40649: done getting next task for host managed_node2 27712 1727096490.40654: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27712 1727096490.40656: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096490.40672: getting variables 27712 1727096490.40673: in VariableManager get_vars() 27712 1727096490.40715: Calling all_inventory to load vars for managed_node2 27712 1727096490.40718: Calling groups_inventory to load vars for managed_node2 27712 1727096490.40721: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096490.40731: Calling all_plugins_play to load vars for managed_node2 27712 1727096490.40734: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096490.40737: Calling groups_plugins_play to load vars for managed_node2 27712 1727096490.41281: done sending task result for task 0afff68d-5257-cbc7-8716-000000000022 27712 1727096490.41284: WORKER PROCESS EXITING 27712 1727096490.42287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096490.43823: done with get_vars() 27712 1727096490.43847: done getting variables 27712 1727096490.43907: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:01:30 -0400 (0:00:00.097) 0:00:16.132 ****** 27712 1727096490.43939: entering _queue_task() for managed_node2/package 27712 1727096490.44232: worker is 1 (out of 1 available) 27712 1727096490.44244: exiting _queue_task() for managed_node2/package 27712 1727096490.44258: done queuing things up, now waiting for results queue to drain 27712 1727096490.44260: waiting for pending results... 
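The consent task above uses the fail action module behind the same wireless/team guard, so on hosts without such profiles it is skipped rather than aborting the play. A minimal sketch of that pattern follows; the module and the when expression come from the log, while the message text and any additional consent variable the real task may check are assumptions:

  - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
    ansible.builtin.fail:
      msg: >-                            # wording assumed for illustration
        Enabling wireless or team interfaces requires restarting NetworkManager;
        confirm the restart before re-running the role.
    when: __network_wireless_connections_defined or __network_team_connections_defined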
27712 1727096490.44531: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 27712 1727096490.44677: in run() - task 0afff68d-5257-cbc7-8716-000000000023 27712 1727096490.44703: variable 'ansible_search_path' from source: unknown 27712 1727096490.44712: variable 'ansible_search_path' from source: unknown 27712 1727096490.44755: calling self._execute() 27712 1727096490.44853: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.44866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.44885: variable 'omit' from source: magic vars 27712 1727096490.45261: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.45281: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096490.45480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096490.45730: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096490.45763: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096490.45791: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096490.45845: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096490.45925: variable 'network_packages' from source: role '' defaults 27712 1727096490.45997: variable '__network_provider_setup' from source: role '' defaults 27712 1727096490.46006: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096490.46057: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096490.46066: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096490.46110: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096490.46225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096490.48006: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096490.48062: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096490.48175: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096490.48179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096490.48181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096490.48265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.48305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.48346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.48407: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.48428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.48488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.48504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.48522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.48550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.48562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.48711: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27712 1727096490.48799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.48815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.48832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.48856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.48869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.48931: variable 'ansible_python' from source: facts 27712 1727096490.48950: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27712 1727096490.49010: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096490.49064: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096490.49147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.49163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 27712 1727096490.49184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.49211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.49221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.49332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.49344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.49346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.49473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.49476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.49521: variable 'network_connections' from source: task vars 27712 1727096490.49532: variable 'interface0' from source: play vars 27712 1727096490.49630: variable 'interface0' from source: play vars 27712 1727096490.49644: variable 'interface0' from source: play vars 27712 1727096490.49747: variable 'interface0' from source: play vars 27712 1727096490.49770: variable 'interface1' from source: play vars 27712 1727096490.49865: variable 'interface1' from source: play vars 27712 1727096490.49886: variable 'interface1' from source: play vars 27712 1727096490.49983: variable 'interface1' from source: play vars 27712 1727096490.50055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096490.50090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096490.50123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.50158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096490.50214: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096490.50499: variable 'network_connections' from source: task vars 27712 1727096490.50502: variable 
'interface0' from source: play vars 27712 1727096490.50576: variable 'interface0' from source: play vars 27712 1727096490.50581: variable 'interface0' from source: play vars 27712 1727096490.50674: variable 'interface0' from source: play vars 27712 1727096490.50694: variable 'interface1' from source: play vars 27712 1727096490.50872: variable 'interface1' from source: play vars 27712 1727096490.50876: variable 'interface1' from source: play vars 27712 1727096490.50901: variable 'interface1' from source: play vars 27712 1727096490.50963: variable '__network_packages_default_wireless' from source: role '' defaults 27712 1727096490.51045: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096490.51374: variable 'network_connections' from source: task vars 27712 1727096490.51386: variable 'interface0' from source: play vars 27712 1727096490.51477: variable 'interface0' from source: play vars 27712 1727096490.51490: variable 'interface0' from source: play vars 27712 1727096490.51566: variable 'interface0' from source: play vars 27712 1727096490.51588: variable 'interface1' from source: play vars 27712 1727096490.51674: variable 'interface1' from source: play vars 27712 1727096490.51677: variable 'interface1' from source: play vars 27712 1727096490.51762: variable 'interface1' from source: play vars 27712 1727096490.51780: variable '__network_packages_default_team' from source: role '' defaults 27712 1727096490.51872: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096490.52239: variable 'network_connections' from source: task vars 27712 1727096490.52242: variable 'interface0' from source: play vars 27712 1727096490.52291: variable 'interface0' from source: play vars 27712 1727096490.52306: variable 'interface0' from source: play vars 27712 1727096490.52413: variable 'interface0' from source: play vars 27712 1727096490.52416: variable 'interface1' from source: play vars 27712 1727096490.52460: variable 'interface1' from source: play vars 27712 1727096490.52463: variable 'interface1' from source: play vars 27712 1727096490.52790: variable 'interface1' from source: play vars 27712 1727096490.52793: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096490.52796: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096490.52798: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096490.52801: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096490.52979: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27712 1727096490.53334: variable 'network_connections' from source: task vars 27712 1727096490.53337: variable 'interface0' from source: play vars 27712 1727096490.53392: variable 'interface0' from source: play vars 27712 1727096490.53406: variable 'interface0' from source: play vars 27712 1727096490.53456: variable 'interface0' from source: play vars 27712 1727096490.53466: variable 'interface1' from source: play vars 27712 1727096490.53514: variable 'interface1' from source: play vars 27712 1727096490.53520: variable 'interface1' from source: play vars 27712 1727096490.53576: variable 'interface1' from source: play vars 27712 1727096490.53587: variable 'ansible_distribution' from source: facts 27712 1727096490.53590: variable '__network_rh_distros' from source: role '' defaults 27712 1727096490.53595: variable 
'ansible_distribution_major_version' from source: facts 27712 1727096490.53617: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27712 1727096490.53860: variable 'ansible_distribution' from source: facts 27712 1727096490.53863: variable '__network_rh_distros' from source: role '' defaults 27712 1727096490.53866: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.53873: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27712 1727096490.53929: variable 'ansible_distribution' from source: facts 27712 1727096490.53933: variable '__network_rh_distros' from source: role '' defaults 27712 1727096490.53938: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.53979: variable 'network_provider' from source: set_fact 27712 1727096490.54006: variable 'ansible_facts' from source: unknown 27712 1727096490.54765: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27712 1727096490.54773: when evaluation is False, skipping this task 27712 1727096490.54776: _execute() done 27712 1727096490.54779: dumping result to json 27712 1727096490.54781: done dumping result, returning 27712 1727096490.54783: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-cbc7-8716-000000000023] 27712 1727096490.54785: sending task result for task 0afff68d-5257-cbc7-8716-000000000023 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27712 1727096490.54901: no more pending results, returning what we have 27712 1727096490.54904: results queue empty 27712 1727096490.54905: checking for any_errors_fatal 27712 1727096490.54912: done checking for any_errors_fatal 27712 1727096490.54913: checking for max_fail_percentage 27712 1727096490.54919: done checking for max_fail_percentage 27712 1727096490.54920: checking to see if all hosts have failed and the running result is not ok 27712 1727096490.54920: done checking to see if all hosts have failed 27712 1727096490.54921: getting the remaining hosts for this loop 27712 1727096490.54922: done getting the remaining hosts for this loop 27712 1727096490.54926: getting the next task for host managed_node2 27712 1727096490.54932: done getting next task for host managed_node2 27712 1727096490.54936: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27712 1727096490.54938: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096490.54952: getting variables 27712 1727096490.54953: in VariableManager get_vars() 27712 1727096490.55108: Calling all_inventory to load vars for managed_node2 27712 1727096490.55111: Calling groups_inventory to load vars for managed_node2 27712 1727096490.55113: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096490.55181: Calling all_plugins_play to load vars for managed_node2 27712 1727096490.55184: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096490.55192: Calling groups_plugins_play to load vars for managed_node2 27712 1727096490.55842: done sending task result for task 0afff68d-5257-cbc7-8716-000000000023 27712 1727096490.55846: WORKER PROCESS EXITING 27712 1727096490.57484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096490.59263: done with get_vars() 27712 1727096490.59296: done getting variables 27712 1727096490.59360: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:01:30 -0400 (0:00:00.154) 0:00:16.287 ****** 27712 1727096490.59398: entering _queue_task() for managed_node2/package 27712 1727096490.59746: worker is 1 (out of 1 available) 27712 1727096490.59758: exiting _queue_task() for managed_node2/package 27712 1727096490.59772: done queuing things up, now waiting for results queue to drain 27712 1727096490.59774: waiting for pending results... 
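The Install packages task is the first one in this block whose guard depends on gathered package facts rather than role defaults: it only runs when network_packages is not already a subset of ansible_facts.packages.keys(), i.e. when something is missing. Both the package action module and that exact condition appear in the log; the name/state parameters in the sketch below are the obvious but assumed completion:

  - name: Install packages
    ansible.builtin.package:
      name: "{{ network_packages }}"
      state: present
    when: not network_packages is subset(ansible_facts.packages.keys())

Because every required package is already present on managed_node2, the condition evaluates False and the task is skipped without touching the package manager.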
27712 1727096490.60129: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27712 1727096490.60270: in run() - task 0afff68d-5257-cbc7-8716-000000000024 27712 1727096490.60294: variable 'ansible_search_path' from source: unknown 27712 1727096490.60303: variable 'ansible_search_path' from source: unknown 27712 1727096490.60345: calling self._execute() 27712 1727096490.60444: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.60461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.60482: variable 'omit' from source: magic vars 27712 1727096490.61199: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.61279: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096490.61374: variable 'network_state' from source: role '' defaults 27712 1727096490.61675: Evaluated conditional (network_state != {}): False 27712 1727096490.61678: when evaluation is False, skipping this task 27712 1727096490.61680: _execute() done 27712 1727096490.61682: dumping result to json 27712 1727096490.61684: done dumping result, returning 27712 1727096490.61687: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-cbc7-8716-000000000024] 27712 1727096490.61690: sending task result for task 0afff68d-5257-cbc7-8716-000000000024 27712 1727096490.61769: done sending task result for task 0afff68d-5257-cbc7-8716-000000000024 27712 1727096490.61772: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096490.61828: no more pending results, returning what we have 27712 1727096490.61833: results queue empty 27712 1727096490.61834: checking for any_errors_fatal 27712 1727096490.61839: done checking for any_errors_fatal 27712 1727096490.61840: checking for max_fail_percentage 27712 1727096490.61842: done checking for max_fail_percentage 27712 1727096490.61843: checking to see if all hosts have failed and the running result is not ok 27712 1727096490.61844: done checking to see if all hosts have failed 27712 1727096490.61845: getting the remaining hosts for this loop 27712 1727096490.61847: done getting the remaining hosts for this loop 27712 1727096490.61851: getting the next task for host managed_node2 27712 1727096490.61858: done getting next task for host managed_node2 27712 1727096490.61862: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27712 1727096490.61866: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096490.61884: getting variables 27712 1727096490.61886: in VariableManager get_vars() 27712 1727096490.61930: Calling all_inventory to load vars for managed_node2 27712 1727096490.61933: Calling groups_inventory to load vars for managed_node2 27712 1727096490.61935: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096490.61947: Calling all_plugins_play to load vars for managed_node2 27712 1727096490.61950: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096490.61953: Calling groups_plugins_play to load vars for managed_node2 27712 1727096490.63731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096490.65227: done with get_vars() 27712 1727096490.65253: done getting variables 27712 1727096490.65317: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:01:30 -0400 (0:00:00.059) 0:00:16.346 ****** 27712 1727096490.65351: entering _queue_task() for managed_node2/package 27712 1727096490.65675: worker is 1 (out of 1 available) 27712 1727096490.65687: exiting _queue_task() for managed_node2/package 27712 1727096490.65699: done queuing things up, now waiting for results queue to drain 27712 1727096490.65700: waiting for pending results... 
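The two nmstate-related installs (the one skipped above and the python3-libnmstate task queued next) are both guarded by network_state != {}, and network_state is empty in this play, so neither runs. A hedged sketch of the first of them, with the package names inferred from the task title rather than taken from the role source:

  - name: Install NetworkManager and nmstate when using network_state variable
    ansible.builtin.package:
      name:
        - NetworkManager    # assumed from the task name
        - nmstate           # assumed from the task name
      state: present
    when: network_state != {}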
27712 1727096490.65983: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27712 1727096490.66127: in run() - task 0afff68d-5257-cbc7-8716-000000000025 27712 1727096490.66147: variable 'ansible_search_path' from source: unknown 27712 1727096490.66156: variable 'ansible_search_path' from source: unknown 27712 1727096490.66198: calling self._execute() 27712 1727096490.66296: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.66308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.66329: variable 'omit' from source: magic vars 27712 1727096490.66709: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.66752: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096490.66861: variable 'network_state' from source: role '' defaults 27712 1727096490.66878: Evaluated conditional (network_state != {}): False 27712 1727096490.66886: when evaluation is False, skipping this task 27712 1727096490.66893: _execute() done 27712 1727096490.66971: dumping result to json 27712 1727096490.66975: done dumping result, returning 27712 1727096490.66978: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-cbc7-8716-000000000025] 27712 1727096490.66980: sending task result for task 0afff68d-5257-cbc7-8716-000000000025 27712 1727096490.67055: done sending task result for task 0afff68d-5257-cbc7-8716-000000000025 27712 1727096490.67059: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096490.67112: no more pending results, returning what we have 27712 1727096490.67115: results queue empty 27712 1727096490.67116: checking for any_errors_fatal 27712 1727096490.67126: done checking for any_errors_fatal 27712 1727096490.67127: checking for max_fail_percentage 27712 1727096490.67128: done checking for max_fail_percentage 27712 1727096490.67129: checking to see if all hosts have failed and the running result is not ok 27712 1727096490.67130: done checking to see if all hosts have failed 27712 1727096490.67131: getting the remaining hosts for this loop 27712 1727096490.67132: done getting the remaining hosts for this loop 27712 1727096490.67136: getting the next task for host managed_node2 27712 1727096490.67141: done getting next task for host managed_node2 27712 1727096490.67145: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27712 1727096490.67148: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096490.67162: getting variables 27712 1727096490.67163: in VariableManager get_vars() 27712 1727096490.67209: Calling all_inventory to load vars for managed_node2 27712 1727096490.67212: Calling groups_inventory to load vars for managed_node2 27712 1727096490.67215: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096490.67227: Calling all_plugins_play to load vars for managed_node2 27712 1727096490.67231: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096490.67234: Calling groups_plugins_play to load vars for managed_node2 27712 1727096490.68861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096490.70434: done with get_vars() 27712 1727096490.70456: done getting variables 27712 1727096490.70552: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:01:30 -0400 (0:00:00.052) 0:00:16.399 ****** 27712 1727096490.70588: entering _queue_task() for managed_node2/service 27712 1727096490.70590: Creating lock for service 27712 1727096490.70898: worker is 1 (out of 1 available) 27712 1727096490.70910: exiting _queue_task() for managed_node2/service 27712 1727096490.70920: done queuing things up, now waiting for results queue to drain 27712 1727096490.70921: waiting for pending results... 
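The task queued here, "Restart NetworkManager due to wireless or team interfaces" (roles/network/tasks/main.yml:109), is likewise skipped further below because neither __network_wireless_connections_defined nor __network_team_connections_defined evaluates to true for these connection profiles. Since the log loads the 'service' action plugin for it, a plausible shape for the task is sketched here; the module parameters are assumptions inferred from the task title, not the role's source:

    # Hypothetical sketch; only the task name and the when-condition are taken from the log.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager            # assumed service name
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined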
27712 1727096490.71296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27712 1727096490.71357: in run() - task 0afff68d-5257-cbc7-8716-000000000026 27712 1727096490.71383: variable 'ansible_search_path' from source: unknown 27712 1727096490.71399: variable 'ansible_search_path' from source: unknown 27712 1727096490.71440: calling self._execute() 27712 1727096490.71541: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.71554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.71573: variable 'omit' from source: magic vars 27712 1727096490.72172: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.72177: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096490.72179: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096490.72309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096490.74444: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096490.74527: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096490.74569: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096490.74611: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096490.74642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096490.74725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.74760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.74901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.74905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.74907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.74909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.74932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.74961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27712 1727096490.75010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.75029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.75074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.75102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.75137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.75182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.75200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.75377: variable 'network_connections' from source: task vars 27712 1727096490.75394: variable 'interface0' from source: play vars 27712 1727096490.75476: variable 'interface0' from source: play vars 27712 1727096490.75490: variable 'interface0' from source: play vars 27712 1727096490.75556: variable 'interface0' from source: play vars 27712 1727096490.75576: variable 'interface1' from source: play vars 27712 1727096490.75637: variable 'interface1' from source: play vars 27712 1727096490.75648: variable 'interface1' from source: play vars 27712 1727096490.75714: variable 'interface1' from source: play vars 27712 1727096490.75881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096490.75976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096490.76020: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096490.76054: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096490.76087: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096490.76135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096490.76161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096490.76192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.76226: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096490.76289: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096490.76533: variable 'network_connections' from source: task vars 27712 1727096490.76543: variable 'interface0' from source: play vars 27712 1727096490.76605: variable 'interface0' from source: play vars 27712 1727096490.76617: variable 'interface0' from source: play vars 27712 1727096490.76683: variable 'interface0' from source: play vars 27712 1727096490.76700: variable 'interface1' from source: play vars 27712 1727096490.76770: variable 'interface1' from source: play vars 27712 1727096490.76856: variable 'interface1' from source: play vars 27712 1727096490.76859: variable 'interface1' from source: play vars 27712 1727096490.76894: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096490.76902: when evaluation is False, skipping this task 27712 1727096490.76909: _execute() done 27712 1727096490.76916: dumping result to json 27712 1727096490.76923: done dumping result, returning 27712 1727096490.76933: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000026] 27712 1727096490.76941: sending task result for task 0afff68d-5257-cbc7-8716-000000000026 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096490.77115: no more pending results, returning what we have 27712 1727096490.77119: results queue empty 27712 1727096490.77120: checking for any_errors_fatal 27712 1727096490.77127: done checking for any_errors_fatal 27712 1727096490.77127: checking for max_fail_percentage 27712 1727096490.77129: done checking for max_fail_percentage 27712 1727096490.77130: checking to see if all hosts have failed and the running result is not ok 27712 1727096490.77131: done checking to see if all hosts have failed 27712 1727096490.77132: getting the remaining hosts for this loop 27712 1727096490.77133: done getting the remaining hosts for this loop 27712 1727096490.77137: getting the next task for host managed_node2 27712 1727096490.77145: done getting next task for host managed_node2 27712 1727096490.77149: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27712 1727096490.77152: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096490.77166: getting variables 27712 1727096490.77372: in VariableManager get_vars() 27712 1727096490.77412: Calling all_inventory to load vars for managed_node2 27712 1727096490.77415: Calling groups_inventory to load vars for managed_node2 27712 1727096490.77417: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096490.77428: Calling all_plugins_play to load vars for managed_node2 27712 1727096490.77431: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096490.77434: Calling groups_plugins_play to load vars for managed_node2 27712 1727096490.77980: done sending task result for task 0afff68d-5257-cbc7-8716-000000000026 27712 1727096490.77984: WORKER PROCESS EXITING 27712 1727096490.78865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096490.80527: done with get_vars() 27712 1727096490.80548: done getting variables 27712 1727096490.80607: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:01:30 -0400 (0:00:00.100) 0:00:16.499 ****** 27712 1727096490.80637: entering _queue_task() for managed_node2/service 27712 1727096490.80956: worker is 1 (out of 1 available) 27712 1727096490.81072: exiting _queue_task() for managed_node2/service 27712 1727096490.81083: done queuing things up, now waiting for results queue to drain 27712 1727096490.81084: waiting for pending results... 
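The task queued here, "Enable and start NetworkManager" (roles/network/tasks/main.yml:122), is the first in this excerpt that actually executes: the module invocation echoed near the end of its stdout reports module_args name=NetworkManager, state=started, enabled=true, handled by the systemd backend (AnsiballZ_systemd.py). Below is a sketch consistent with that invocation and with the variables resolved in the following lines (network_service_name, network_provider, network_state); the literal YAML is an assumption, not the role's source:

    # Consistent with the invocation JSON later in this log
    # ({"name": "NetworkManager", "state": "started", "enabled": true, ...});
    # the exact wording of the real task is unknown.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager on this host
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}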
27712 1727096490.81257: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27712 1727096490.81395: in run() - task 0afff68d-5257-cbc7-8716-000000000027 27712 1727096490.81420: variable 'ansible_search_path' from source: unknown 27712 1727096490.81427: variable 'ansible_search_path' from source: unknown 27712 1727096490.81466: calling self._execute() 27712 1727096490.81563: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.81577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.81591: variable 'omit' from source: magic vars 27712 1727096490.81972: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.81990: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096490.82149: variable 'network_provider' from source: set_fact 27712 1727096490.82158: variable 'network_state' from source: role '' defaults 27712 1727096490.82174: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27712 1727096490.82187: variable 'omit' from source: magic vars 27712 1727096490.82241: variable 'omit' from source: magic vars 27712 1727096490.82278: variable 'network_service_name' from source: role '' defaults 27712 1727096490.82352: variable 'network_service_name' from source: role '' defaults 27712 1727096490.82457: variable '__network_provider_setup' from source: role '' defaults 27712 1727096490.82475: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096490.82539: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096490.82554: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096490.82622: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096490.82819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096490.85275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096490.85360: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096490.85411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096490.85457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096490.85527: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096490.85649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.85653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.85668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.85717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 27712 1727096490.85740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.85797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.85828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.85864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.85964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.85970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.86161: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27712 1727096490.86297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.86323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.86351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.86402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.86424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.86521: variable 'ansible_python' from source: facts 27712 1727096490.86617: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27712 1727096490.86638: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096490.86732: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096490.86863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.86898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.86931: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.86980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.86999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.87053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096490.87095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096490.87261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.87276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096490.87279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096490.87390: variable 'network_connections' from source: task vars 27712 1727096490.87394: variable 'interface0' from source: play vars 27712 1727096490.87441: variable 'interface0' from source: play vars 27712 1727096490.87458: variable 'interface0' from source: play vars 27712 1727096490.87546: variable 'interface0' from source: play vars 27712 1727096490.87585: variable 'interface1' from source: play vars 27712 1727096490.87670: variable 'interface1' from source: play vars 27712 1727096490.87711: variable 'interface1' from source: play vars 27712 1727096490.87780: variable 'interface1' from source: play vars 27712 1727096490.87932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096490.88214: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096490.88237: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096490.88302: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096490.88349: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096490.88428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096490.88585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096490.88591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096490.88595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096490.88619: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096490.88954: variable 'network_connections' from source: task vars 27712 1727096490.88981: variable 'interface0' from source: play vars 27712 1727096490.89099: variable 'interface0' from source: play vars 27712 1727096490.89234: variable 'interface0' from source: play vars 27712 1727096490.89573: variable 'interface0' from source: play vars 27712 1727096490.89578: variable 'interface1' from source: play vars 27712 1727096490.89599: variable 'interface1' from source: play vars 27712 1727096490.89617: variable 'interface1' from source: play vars 27712 1727096490.89708: variable 'interface1' from source: play vars 27712 1727096490.89786: variable '__network_packages_default_wireless' from source: role '' defaults 27712 1727096490.89878: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096490.90208: variable 'network_connections' from source: task vars 27712 1727096490.90222: variable 'interface0' from source: play vars 27712 1727096490.90342: variable 'interface0' from source: play vars 27712 1727096490.90355: variable 'interface0' from source: play vars 27712 1727096490.90641: variable 'interface0' from source: play vars 27712 1727096490.90645: variable 'interface1' from source: play vars 27712 1727096490.90681: variable 'interface1' from source: play vars 27712 1727096490.90694: variable 'interface1' from source: play vars 27712 1727096490.90977: variable 'interface1' from source: play vars 27712 1727096490.90998: variable '__network_packages_default_team' from source: role '' defaults 27712 1727096490.91201: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096490.91565: variable 'network_connections' from source: task vars 27712 1727096490.91575: variable 'interface0' from source: play vars 27712 1727096490.91641: variable 'interface0' from source: play vars 27712 1727096490.91644: variable 'interface0' from source: play vars 27712 1727096490.91728: variable 'interface0' from source: play vars 27712 1727096490.91731: variable 'interface1' from source: play vars 27712 1727096490.91795: variable 'interface1' from source: play vars 27712 1727096490.91801: variable 'interface1' from source: play vars 27712 1727096490.91877: variable 'interface1' from source: play vars 27712 1727096490.91947: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096490.91992: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096490.91998: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096490.92042: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096490.92199: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27712 1727096490.92501: variable 'network_connections' from source: task vars 27712 1727096490.92504: variable 'interface0' from source: play vars 27712 1727096490.92547: variable 'interface0' from source: play vars 27712 1727096490.92552: variable 'interface0' from source: play vars 27712 
1727096490.92597: variable 'interface0' from source: play vars 27712 1727096490.92606: variable 'interface1' from source: play vars 27712 1727096490.92648: variable 'interface1' from source: play vars 27712 1727096490.92654: variable 'interface1' from source: play vars 27712 1727096490.92697: variable 'interface1' from source: play vars 27712 1727096490.92707: variable 'ansible_distribution' from source: facts 27712 1727096490.92710: variable '__network_rh_distros' from source: role '' defaults 27712 1727096490.92715: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.92734: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27712 1727096490.92846: variable 'ansible_distribution' from source: facts 27712 1727096490.92849: variable '__network_rh_distros' from source: role '' defaults 27712 1727096490.92853: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.92864: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27712 1727096490.92977: variable 'ansible_distribution' from source: facts 27712 1727096490.92982: variable '__network_rh_distros' from source: role '' defaults 27712 1727096490.92984: variable 'ansible_distribution_major_version' from source: facts 27712 1727096490.93010: variable 'network_provider' from source: set_fact 27712 1727096490.93029: variable 'omit' from source: magic vars 27712 1727096490.93053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096490.93077: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096490.93091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096490.93104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096490.93112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096490.93136: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096490.93139: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.93142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.93210: Set connection var ansible_connection to ssh 27712 1727096490.93217: Set connection var ansible_pipelining to False 27712 1727096490.93222: Set connection var ansible_timeout to 10 27712 1727096490.93224: Set connection var ansible_shell_type to sh 27712 1727096490.93251: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096490.93262: Set connection var ansible_shell_executable to /bin/sh 27712 1727096490.93279: variable 'ansible_shell_executable' from source: unknown 27712 1727096490.93282: variable 'ansible_connection' from source: unknown 27712 1727096490.93285: variable 'ansible_module_compression' from source: unknown 27712 1727096490.93287: variable 'ansible_shell_type' from source: unknown 27712 1727096490.93289: variable 'ansible_shell_executable' from source: unknown 27712 1727096490.93291: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096490.93295: variable 'ansible_pipelining' from source: unknown 27712 1727096490.93298: variable 'ansible_timeout' from source: unknown 27712 1727096490.93302: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096490.93476: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096490.93480: variable 'omit' from source: magic vars 27712 1727096490.93482: starting attempt loop 27712 1727096490.93531: running the handler 27712 1727096490.93597: variable 'ansible_facts' from source: unknown 27712 1727096490.95635: _low_level_execute_command(): starting 27712 1727096490.95650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096490.97179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096490.97285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096490.97390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096490.99025: stdout chunk (state=3): >>>/root <<< 27712 1727096490.99117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096490.99343: stderr chunk (state=3): >>><<< 27712 1727096490.99347: stdout chunk (state=3): >>><<< 27712 1727096490.99349: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 27712 1727096490.99353: _low_level_execute_command(): starting 27712 1727096490.99356: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224 `" && echo ansible-tmp-1727096490.9924955-28564-104617933553224="` echo /root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224 `" ) && sleep 0' 27712 1727096491.00285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096491.00289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096491.00291: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096491.00418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096491.00455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096491.00693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096491.00732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096491.02666: stdout chunk (state=3): >>>ansible-tmp-1727096490.9924955-28564-104617933553224=/root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224 <<< 27712 1727096491.03080: stdout chunk (state=3): >>><<< 27712 1727096491.03083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096491.03086: stderr chunk (state=3): >>><<< 27712 1727096491.03089: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096490.9924955-28564-104617933553224=/root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096491.03097: variable 'ansible_module_compression' from source: unknown 27712 1727096491.03100: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 27712 1727096491.03103: ANSIBALLZ: Acquiring lock 27712 1727096491.03105: ANSIBALLZ: Lock acquired: 140297911472480 27712 1727096491.03107: ANSIBALLZ: Creating module 27712 1727096491.28451: ANSIBALLZ: Writing module into payload 27712 1727096491.28556: ANSIBALLZ: Writing module 27712 1727096491.28581: ANSIBALLZ: Renaming module 27712 1727096491.28586: ANSIBALLZ: Done creating module 27712 1727096491.28618: variable 'ansible_facts' from source: unknown 27712 1727096491.28752: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/AnsiballZ_systemd.py 27712 1727096491.28860: Sending initial data 27712 1727096491.28863: Sent initial data (156 bytes) 27712 1727096491.29343: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096491.29346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096491.29348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096491.29351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096491.29353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096491.29398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096491.29410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096491.29459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096491.31175: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096491.31246: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096491.31288: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpgaxgssp3 /root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/AnsiballZ_systemd.py <<< 27712 1727096491.31292: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/AnsiballZ_systemd.py" <<< 27712 1727096491.31306: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpgaxgssp3" to remote "/root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/AnsiballZ_systemd.py" <<< 27712 1727096491.32319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096491.32378: stderr chunk (state=3): >>><<< 27712 1727096491.32381: stdout chunk (state=3): >>><<< 27712 1727096491.32389: done transferring module to remote 27712 1727096491.32398: _low_level_execute_command(): starting 27712 1727096491.32403: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/ /root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/AnsiballZ_systemd.py && sleep 0' 27712 1727096491.32825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096491.32832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096491.32859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096491.32862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096491.32865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096491.32867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096491.32920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096491.32923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096491.32960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096491.34849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096491.34862: stderr chunk (state=3): >>><<< 27712 1727096491.34873: stdout chunk (state=3): >>><<< 27712 1727096491.34895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096491.34902: _low_level_execute_command(): starting 27712 1727096491.34911: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/AnsiballZ_systemd.py && sleep 0' 27712 1727096491.35513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096491.35532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096491.35547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096491.35564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096491.35591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096491.35686: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096491.35707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096491.35724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096491.35797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096491.65403: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": 
"6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4661248", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296280576", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1651079000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 27712 1727096491.65437: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", 
"SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27712 1727096491.67707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096491.67711: stdout chunk (state=3): >>><<< 27712 1727096491.67714: stderr chunk (state=3): >>><<< 27712 1727096491.68169: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4661248", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296280576", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1651079000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096491.68242: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096491.68314: _low_level_execute_command(): starting 27712 1727096491.68323: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096490.9924955-28564-104617933553224/ > /dev/null 2>&1 && sleep 0' 27712 1727096491.69082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096491.69091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096491.69169: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27712 1727096491.71097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096491.71190: stderr chunk (state=3): >>><<< 27712 1727096491.71259: stdout chunk (state=3): >>><<< 27712 1727096491.71306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096491.71310: handler run complete 27712 1727096491.71415: attempt loop complete, returning result 27712 1727096491.71418: _execute() done 27712 1727096491.71420: dumping result to json 27712 1727096491.71422: done dumping result, returning 27712 1727096491.71425: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-cbc7-8716-000000000027] 27712 1727096491.71427: sending task result for task 0afff68d-5257-cbc7-8716-000000000027 27712 1727096491.71889: done sending task result for task 0afff68d-5257-cbc7-8716-000000000027 27712 1727096491.71892: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096491.71986: no more pending results, returning what we have 27712 1727096491.71990: results queue empty 27712 1727096491.71991: checking for any_errors_fatal 27712 1727096491.72000: done checking for any_errors_fatal 27712 1727096491.72013: checking for max_fail_percentage 27712 1727096491.72016: done checking for max_fail_percentage 27712 1727096491.72017: checking to see if all hosts have failed and the running result is not ok 27712 1727096491.72018: done checking to see if all hosts have failed 27712 1727096491.72018: getting the remaining hosts for this loop 27712 1727096491.72020: done getting the remaining hosts for this loop 27712 1727096491.72024: getting the next task for host managed_node2 27712 1727096491.72032: done getting next task for host managed_node2 27712 1727096491.72035: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27712 1727096491.72039: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096491.72052: getting variables 27712 1727096491.72054: in VariableManager get_vars() 27712 1727096491.72157: Calling all_inventory to load vars for managed_node2 27712 1727096491.72160: Calling groups_inventory to load vars for managed_node2 27712 1727096491.72163: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096491.72237: Calling all_plugins_play to load vars for managed_node2 27712 1727096491.72241: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096491.72244: Calling groups_plugins_play to load vars for managed_node2 27712 1727096491.73843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096491.75495: done with get_vars() 27712 1727096491.75521: done getting variables 27712 1727096491.75578: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:01:31 -0400 (0:00:00.949) 0:00:17.449 ****** 27712 1727096491.75608: entering _queue_task() for managed_node2/service 27712 1727096491.76051: worker is 1 (out of 1 available) 27712 1727096491.76062: exiting _queue_task() for managed_node2/service 27712 1727096491.76076: done queuing things up, now waiting for results queue to drain 27712 1727096491.76077: waiting for pending results... 
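For context, the censored "ok" result above comes from the ansible.legacy.systemd module invoked with the module_args recorded in the log (name=NetworkManager, state=started, enabled=true, scope=system). The sketch below is a minimal, illustrative play that would drive the same module call; it is not the role's actual task at roles/network/tasks/main.yml, and the host pattern is assumed from the log's target node.

```yaml
# Hedged sketch: a minimal play issuing the same systemd module call the log
# records for "Enable and start NetworkManager". Host pattern is an assumption;
# the real role task may set additional parameters.
- hosts: managed_node2
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager     # "name": "NetworkManager" in module_args
        state: started           # "state": "started" in module_args
        enabled: true            # "enabled": true in module_args
      no_log: true               # matches the censored result shown in the log
```

The "censored" output above is the effect of no_log: true, which the log states explicitly; the full unit status returned by systemd is still visible in the raw stdout chunks.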
27712 1727096491.76499: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27712 1727096491.76505: in run() - task 0afff68d-5257-cbc7-8716-000000000028 27712 1727096491.76507: variable 'ansible_search_path' from source: unknown 27712 1727096491.76510: variable 'ansible_search_path' from source: unknown 27712 1727096491.76512: calling self._execute() 27712 1727096491.76573: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096491.76586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096491.76604: variable 'omit' from source: magic vars 27712 1727096491.76998: variable 'ansible_distribution_major_version' from source: facts 27712 1727096491.77033: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096491.77151: variable 'network_provider' from source: set_fact 27712 1727096491.77162: Evaluated conditional (network_provider == "nm"): True 27712 1727096491.77361: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096491.77364: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096491.77519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096491.81976: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096491.82042: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096491.82228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096491.82270: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096491.82386: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096491.82469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096491.82546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096491.82651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096491.82702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096491.82747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096491.82866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096491.82964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 27712 1727096491.83077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096491.83131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096491.83153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096491.83265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096491.83303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096491.83331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096491.83392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096491.83495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096491.83554: variable 'network_connections' from source: task vars 27712 1727096491.83575: variable 'interface0' from source: play vars 27712 1727096491.83653: variable 'interface0' from source: play vars 27712 1727096491.83674: variable 'interface0' from source: play vars 27712 1727096491.83742: variable 'interface0' from source: play vars 27712 1727096491.83760: variable 'interface1' from source: play vars 27712 1727096491.83830: variable 'interface1' from source: play vars 27712 1727096491.83842: variable 'interface1' from source: play vars 27712 1727096491.83908: variable 'interface1' from source: play vars 27712 1727096491.84001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096491.84164: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096491.84212: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096491.84249: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096491.84377: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096491.84380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096491.84382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096491.84392: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096491.84424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096491.84510: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096491.84811: variable 'network_connections' from source: task vars 27712 1727096491.84826: variable 'interface0' from source: play vars 27712 1727096491.84909: variable 'interface0' from source: play vars 27712 1727096491.84922: variable 'interface0' from source: play vars 27712 1727096491.85099: variable 'interface0' from source: play vars 27712 1727096491.85117: variable 'interface1' from source: play vars 27712 1727096491.85263: variable 'interface1' from source: play vars 27712 1727096491.85266: variable 'interface1' from source: play vars 27712 1727096491.85584: variable 'interface1' from source: play vars 27712 1727096491.85587: Evaluated conditional (__network_wpa_supplicant_required): False 27712 1727096491.85590: when evaluation is False, skipping this task 27712 1727096491.85592: _execute() done 27712 1727096491.85594: dumping result to json 27712 1727096491.85596: done dumping result, returning 27712 1727096491.85598: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-cbc7-8716-000000000028] 27712 1727096491.85600: sending task result for task 0afff68d-5257-cbc7-8716-000000000028 27712 1727096491.85755: done sending task result for task 0afff68d-5257-cbc7-8716-000000000028 27712 1727096491.85758: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27712 1727096491.85836: no more pending results, returning what we have 27712 1727096491.85840: results queue empty 27712 1727096491.85841: checking for any_errors_fatal 27712 1727096491.85864: done checking for any_errors_fatal 27712 1727096491.85865: checking for max_fail_percentage 27712 1727096491.85869: done checking for max_fail_percentage 27712 1727096491.85873: checking to see if all hosts have failed and the running result is not ok 27712 1727096491.85874: done checking to see if all hosts have failed 27712 1727096491.85875: getting the remaining hosts for this loop 27712 1727096491.85876: done getting the remaining hosts for this loop 27712 1727096491.85881: getting the next task for host managed_node2 27712 1727096491.85888: done getting next task for host managed_node2 27712 1727096491.85893: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27712 1727096491.85896: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096491.85910: getting variables 27712 1727096491.85912: in VariableManager get_vars() 27712 1727096491.85958: Calling all_inventory to load vars for managed_node2 27712 1727096491.85961: Calling groups_inventory to load vars for managed_node2 27712 1727096491.85963: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096491.86135: Calling all_plugins_play to load vars for managed_node2 27712 1727096491.86139: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096491.86143: Calling groups_plugins_play to load vars for managed_node2 27712 1727096491.87891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096491.90317: done with get_vars() 27712 1727096491.90340: done getting variables 27712 1727096491.90522: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:01:31 -0400 (0:00:00.149) 0:00:17.598 ****** 27712 1727096491.90557: entering _queue_task() for managed_node2/service 27712 1727096491.91170: worker is 1 (out of 1 available) 27712 1727096491.91183: exiting _queue_task() for managed_node2/service 27712 1727096491.91195: done queuing things up, now waiting for results queue to drain 27712 1727096491.91196: waiting for pending results... 27712 1727096491.91569: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 27712 1727096491.91878: in run() - task 0afff68d-5257-cbc7-8716-000000000029 27712 1727096491.91882: variable 'ansible_search_path' from source: unknown 27712 1727096491.91886: variable 'ansible_search_path' from source: unknown 27712 1727096491.91943: calling self._execute() 27712 1727096491.92096: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096491.92103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096491.92113: variable 'omit' from source: magic vars 27712 1727096491.93277: variable 'ansible_distribution_major_version' from source: facts 27712 1727096491.93281: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096491.93479: variable 'network_provider' from source: set_fact 27712 1727096491.93489: Evaluated conditional (network_provider == "initscripts"): False 27712 1727096491.93615: when evaluation is False, skipping this task 27712 1727096491.93618: _execute() done 27712 1727096491.93621: dumping result to json 27712 1727096491.93624: done dumping result, returning 27712 1727096491.93631: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-cbc7-8716-000000000029] 27712 1727096491.93639: sending task result for task 0afff68d-5257-cbc7-8716-000000000029 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096491.93792: no more pending results, returning what we have 27712 1727096491.93795: results queue empty 27712 1727096491.93796: checking for 
any_errors_fatal 27712 1727096491.93806: done checking for any_errors_fatal 27712 1727096491.93807: checking for max_fail_percentage 27712 1727096491.93809: done checking for max_fail_percentage 27712 1727096491.93810: checking to see if all hosts have failed and the running result is not ok 27712 1727096491.93811: done checking to see if all hosts have failed 27712 1727096491.93812: getting the remaining hosts for this loop 27712 1727096491.93813: done getting the remaining hosts for this loop 27712 1727096491.93819: getting the next task for host managed_node2 27712 1727096491.93826: done getting next task for host managed_node2 27712 1727096491.93830: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27712 1727096491.93834: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096491.93851: getting variables 27712 1727096491.93852: in VariableManager get_vars() 27712 1727096491.93904: Calling all_inventory to load vars for managed_node2 27712 1727096491.93909: Calling groups_inventory to load vars for managed_node2 27712 1727096491.93912: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096491.93934: Calling all_plugins_play to load vars for managed_node2 27712 1727096491.93938: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096491.93942: Calling groups_plugins_play to load vars for managed_node2 27712 1727096491.95563: done sending task result for task 0afff68d-5257-cbc7-8716-000000000029 27712 1727096491.95569: WORKER PROCESS EXITING 27712 1727096491.96996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096492.00421: done with get_vars() 27712 1727096492.00566: done getting variables 27712 1727096492.00642: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:01:32 -0400 (0:00:00.102) 0:00:17.701 ****** 27712 1727096492.00784: entering _queue_task() for managed_node2/copy 27712 1727096492.01443: worker is 1 (out of 1 available) 27712 1727096492.01456: exiting _queue_task() for managed_node2/copy 27712 1727096492.01538: done queuing things up, now waiting for results queue to drain 27712 1727096492.01541: waiting for pending results... 
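The two skipped results above are produced by `when:` conditionals: the wpa_supplicant task is gated on `__network_wpa_supplicant_required` (evaluated False here; the log shows the role loading the 802.1X and wireless connection flags before that evaluation), and the network-service task on `network_provider == "initscripts"` (False because the provider was set to "nm"). The following is an illustrative tasks-file-style sketch of that gating pattern; the conditions are taken from the skip results, while the module bodies are placeholders rather than the role's real task definitions.

```yaml
# Illustrative sketch of the conditional gating seen in the log; variable names
# come from the false_condition fields above, task bodies are placeholders.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required        # evaluated False -> skipped

- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"        # provider is "nm" -> skipped
```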
27712 1727096492.01731: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27712 1727096492.01849: in run() - task 0afff68d-5257-cbc7-8716-00000000002a 27712 1727096492.01862: variable 'ansible_search_path' from source: unknown 27712 1727096492.01876: variable 'ansible_search_path' from source: unknown 27712 1727096492.01912: calling self._execute() 27712 1727096492.02005: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096492.02010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096492.02018: variable 'omit' from source: magic vars 27712 1727096492.02381: variable 'ansible_distribution_major_version' from source: facts 27712 1727096492.02392: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096492.02500: variable 'network_provider' from source: set_fact 27712 1727096492.02505: Evaluated conditional (network_provider == "initscripts"): False 27712 1727096492.02508: when evaluation is False, skipping this task 27712 1727096492.02511: _execute() done 27712 1727096492.02574: dumping result to json 27712 1727096492.02582: done dumping result, returning 27712 1727096492.02586: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-cbc7-8716-00000000002a] 27712 1727096492.02589: sending task result for task 0afff68d-5257-cbc7-8716-00000000002a 27712 1727096492.02655: done sending task result for task 0afff68d-5257-cbc7-8716-00000000002a 27712 1727096492.02659: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27712 1727096492.02713: no more pending results, returning what we have 27712 1727096492.02717: results queue empty 27712 1727096492.02718: checking for any_errors_fatal 27712 1727096492.02724: done checking for any_errors_fatal 27712 1727096492.02725: checking for max_fail_percentage 27712 1727096492.02727: done checking for max_fail_percentage 27712 1727096492.02728: checking to see if all hosts have failed and the running result is not ok 27712 1727096492.02734: done checking to see if all hosts have failed 27712 1727096492.02735: getting the remaining hosts for this loop 27712 1727096492.02737: done getting the remaining hosts for this loop 27712 1727096492.02740: getting the next task for host managed_node2 27712 1727096492.02748: done getting next task for host managed_node2 27712 1727096492.02752: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27712 1727096492.02756: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096492.02777: getting variables 27712 1727096492.02779: in VariableManager get_vars() 27712 1727096492.02822: Calling all_inventory to load vars for managed_node2 27712 1727096492.02825: Calling groups_inventory to load vars for managed_node2 27712 1727096492.02828: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096492.02958: Calling all_plugins_play to load vars for managed_node2 27712 1727096492.02962: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096492.02966: Calling groups_plugins_play to load vars for managed_node2 27712 1727096492.04519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096492.05976: done with get_vars() 27712 1727096492.06000: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:01:32 -0400 (0:00:00.052) 0:00:17.753 ****** 27712 1727096492.06059: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27712 1727096492.06060: Creating lock for fedora.linux_system_roles.network_connections 27712 1727096492.06311: worker is 1 (out of 1 available) 27712 1727096492.06326: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27712 1727096492.06339: done queuing things up, now waiting for results queue to drain 27712 1727096492.06340: waiting for pending results... 27712 1727096492.06514: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27712 1727096492.06601: in run() - task 0afff68d-5257-cbc7-8716-00000000002b 27712 1727096492.06615: variable 'ansible_search_path' from source: unknown 27712 1727096492.06619: variable 'ansible_search_path' from source: unknown 27712 1727096492.06645: calling self._execute() 27712 1727096492.06728: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096492.06732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096492.06740: variable 'omit' from source: magic vars 27712 1727096492.07016: variable 'ansible_distribution_major_version' from source: facts 27712 1727096492.07025: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096492.07031: variable 'omit' from source: magic vars 27712 1727096492.07072: variable 'omit' from source: magic vars 27712 1727096492.07179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096492.09001: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096492.09047: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096492.09174: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096492.09178: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096492.09181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096492.09183: variable 'network_provider' from source: set_fact 27712 1727096492.09263: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096492.09299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096492.09316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096492.09341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096492.09352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096492.09409: variable 'omit' from source: magic vars 27712 1727096492.09488: variable 'omit' from source: magic vars 27712 1727096492.09557: variable 'network_connections' from source: task vars 27712 1727096492.09567: variable 'interface0' from source: play vars 27712 1727096492.09618: variable 'interface0' from source: play vars 27712 1727096492.09625: variable 'interface0' from source: play vars 27712 1727096492.09666: variable 'interface0' from source: play vars 27712 1727096492.09678: variable 'interface1' from source: play vars 27712 1727096492.09722: variable 'interface1' from source: play vars 27712 1727096492.09728: variable 'interface1' from source: play vars 27712 1727096492.09774: variable 'interface1' from source: play vars 27712 1727096492.09914: variable 'omit' from source: magic vars 27712 1727096492.09921: variable '__lsr_ansible_managed' from source: task vars 27712 1727096492.09965: variable '__lsr_ansible_managed' from source: task vars 27712 1727096492.10141: Loaded config def from plugin (lookup/template) 27712 1727096492.10147: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27712 1727096492.10168: File lookup term: get_ansible_managed.j2 27712 1727096492.10174: variable 'ansible_search_path' from source: unknown 27712 1727096492.10177: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27712 1727096492.10187: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27712 1727096492.10200: variable 'ansible_search_path' from source: unknown 27712 1727096492.14576: variable 'ansible_managed' from source: unknown 27712 1727096492.14582: variable 'omit' from source: magic vars 27712 1727096492.14598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096492.14635: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096492.14663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096492.14681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096492.14689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096492.14720: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096492.14724: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096492.14726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096492.14822: Set connection var ansible_connection to ssh 27712 1727096492.14825: Set connection var ansible_pipelining to False 27712 1727096492.14827: Set connection var ansible_timeout to 10 27712 1727096492.14829: Set connection var ansible_shell_type to sh 27712 1727096492.14831: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096492.14835: Set connection var ansible_shell_executable to /bin/sh 27712 1727096492.14853: variable 'ansible_shell_executable' from source: unknown 27712 1727096492.14856: variable 'ansible_connection' from source: unknown 27712 1727096492.14858: variable 'ansible_module_compression' from source: unknown 27712 1727096492.14860: variable 'ansible_shell_type' from source: unknown 27712 1727096492.14863: variable 'ansible_shell_executable' from source: unknown 27712 1727096492.14865: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096492.14892: variable 'ansible_pipelining' from source: unknown 27712 1727096492.14895: variable 'ansible_timeout' from source: unknown 27712 1727096492.14906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096492.15040: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096492.15044: variable 'omit' from source: magic vars 27712 1727096492.15047: starting attempt loop 27712 1727096492.15050: running the handler 27712 1727096492.15052: _low_level_execute_command(): starting 27712 1727096492.15054: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096492.15774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096492.15779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096492.15781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096492.15784: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096492.15786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096492.15789: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096492.15791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096492.15814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096492.15817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096492.15820: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096492.15822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096492.15833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096492.15841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096492.15849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096492.15855: stderr chunk (state=3): >>>debug2: match found <<< 27712 1727096492.15864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096492.15952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096492.15960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096492.15985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096492.16032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096492.17725: stdout chunk (state=3): >>>/root <<< 27712 1727096492.17824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096492.17854: stderr chunk (state=3): >>><<< 27712 1727096492.17858: stdout chunk (state=3): >>><<< 27712 1727096492.17902: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096492.17906: _low_level_execute_command(): starting 27712 1727096492.17990: _low_level_execute_command(): executing: /bin/sh -c '( umask 
77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615 `" && echo ansible-tmp-1727096492.1789002-28626-117816045301615="` echo /root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615 `" ) && sleep 0' 27712 1727096492.18518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096492.18527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096492.18577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096492.18580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096492.18583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096492.18585: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096492.18587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096492.18591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096492.18610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096492.18613: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096492.18615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096492.18622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096492.18674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096492.18677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096492.18679: stderr chunk (state=3): >>>debug2: match found <<< 27712 1727096492.18681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096492.18726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096492.18772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096492.18828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096492.18907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096492.20901: stdout chunk (state=3): >>>ansible-tmp-1727096492.1789002-28626-117816045301615=/root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615 <<< 27712 1727096492.21070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096492.21074: stdout chunk (state=3): >>><<< 27712 1727096492.21076: stderr chunk (state=3): >>><<< 27712 1727096492.21273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096492.1789002-28626-117816045301615=/root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096492.21281: variable 'ansible_module_compression' from source: unknown 27712 1727096492.21284: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 27712 1727096492.21286: ANSIBALLZ: Acquiring lock 27712 1727096492.21288: ANSIBALLZ: Lock acquired: 140297907686240 27712 1727096492.21290: ANSIBALLZ: Creating module 27712 1727096492.46781: ANSIBALLZ: Writing module into payload 27712 1727096492.47111: ANSIBALLZ: Writing module 27712 1727096492.47137: ANSIBALLZ: Renaming module 27712 1727096492.47148: ANSIBALLZ: Done creating module 27712 1727096492.47183: variable 'ansible_facts' from source: unknown 27712 1727096492.47306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/AnsiballZ_network_connections.py 27712 1727096492.47505: Sending initial data 27712 1727096492.47508: Sent initial data (168 bytes) 27712 1727096492.48145: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096492.48159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096492.48176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096492.48196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096492.48254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096492.48307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096492.48322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096492.48345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096492.48487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096492.50168: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096492.50272: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096492.50341: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpmgb2fgdd /root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/AnsiballZ_network_connections.py <<< 27712 1727096492.50776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/AnsiballZ_network_connections.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpmgb2fgdd" to remote "/root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/AnsiballZ_network_connections.py" <<< 27712 1727096492.51940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096492.52011: stderr chunk (state=3): >>><<< 27712 1727096492.52019: stdout chunk (state=3): >>><<< 27712 1727096492.52073: done transferring module to remote 27712 1727096492.52090: _low_level_execute_command(): starting 27712 1727096492.52100: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/ /root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/AnsiballZ_network_connections.py && sleep 0' 27712 1727096492.53375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096492.53393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096492.53396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096492.53448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096492.53461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096492.53544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096492.53679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096492.53746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 
1727096492.53818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096492.55657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096492.55705: stderr chunk (state=3): >>><<< 27712 1727096492.55715: stdout chunk (state=3): >>><<< 27712 1727096492.55746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096492.55766: _low_level_execute_command(): starting 27712 1727096492.55779: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/AnsiballZ_network_connections.py && sleep 0' 27712 1727096492.56381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096492.56395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096492.56411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096492.56429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096492.56446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096492.56491: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096492.56564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096492.56607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096492.56652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096493.23343: stdout chunk (state=3): >>> {"changed": true, "warnings": [], 
"stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27712 1727096493.25477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096493.25481: stderr chunk (state=3): >>><<< 27712 1727096493.25483: stdout chunk (state=3): >>><<< 27712 1727096493.25486: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.126 closed. 27712 1727096493.25488: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.3/24', '2001:db8::2/32'], 'route': [{'network': '198.51.10.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4}, {'network': '2001:db6::4', 'prefix': 128, 'gateway': '2001:db8::1', 'metric': 2}]}}, {'name': 'ethtest1', 'interface_name': 'ethtest1', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.6/24', '2001:db8::4/32'], 'route': [{'network': '198.51.12.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096493.25495: _low_level_execute_command(): starting 27712 1727096493.25498: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096492.1789002-28626-117816045301615/ > /dev/null 2>&1 && sleep 0' 27712 1727096493.26016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096493.26030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096493.26040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096493.26088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096493.26100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096493.26137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096493.28074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096493.28292: stderr chunk (state=3): >>><<< 27712 1727096493.28296: stdout chunk (state=3): >>><<< 27712 1727096493.28298: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096493.28301: handler run complete 27712 1727096493.28303: attempt loop complete, returning result 27712 1727096493.28305: _execute() done 27712 1727096493.28307: dumping result to json 27712 1727096493.28309: done dumping result, returning 27712 1727096493.28311: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-cbc7-8716-00000000002b] 27712 1727096493.28313: sending task result for task 0afff68d-5257-cbc7-8716-00000000002b 27712 1727096493.28404: done sending task result for task 0afff68d-5257-cbc7-8716-00000000002b 27712 1727096493.28407: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e [006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa [007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e (not-active) [008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa (not-active) 27712 1727096493.28594: no more pending results, returning what we have 27712 1727096493.28597: results queue empty 27712 1727096493.28598: checking for any_errors_fatal 27712 1727096493.28605: done checking for any_errors_fatal 27712 1727096493.28606: checking for max_fail_percentage 27712 
1727096493.28607: done checking for max_fail_percentage 27712 1727096493.28608: checking to see if all hosts have failed and the running result is not ok 27712 1727096493.28609: done checking to see if all hosts have failed 27712 1727096493.28610: getting the remaining hosts for this loop 27712 1727096493.28611: done getting the remaining hosts for this loop 27712 1727096493.28614: getting the next task for host managed_node2 27712 1727096493.28620: done getting next task for host managed_node2 27712 1727096493.28624: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27712 1727096493.28676: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096493.28693: getting variables 27712 1727096493.28698: in VariableManager get_vars() 27712 1727096493.28835: Calling all_inventory to load vars for managed_node2 27712 1727096493.28902: Calling groups_inventory to load vars for managed_node2 27712 1727096493.28907: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096493.28918: Calling all_plugins_play to load vars for managed_node2 27712 1727096493.28921: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096493.28924: Calling groups_plugins_play to load vars for managed_node2 27712 1727096493.29977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096493.30829: done with get_vars() 27712 1727096493.30846: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:01:33 -0400 (0:00:01.248) 0:00:19.002 ****** 27712 1727096493.30905: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27712 1727096493.30906: Creating lock for fedora.linux_system_roles.network_state 27712 1727096493.31166: worker is 1 (out of 1 available) 27712 1727096493.31182: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27712 1727096493.31195: done queuing things up, now waiting for results queue to drain 27712 1727096493.31196: waiting for pending results... 
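Editor's note: the task result above for "Configure networking connection profiles" reports its full module_args, so the role input can be reconstructed from the log. The sketch below is a hedged reconstruction, not the playbook itself (which is not shown in this log); the play wrapper and the variable names network_provider / network_connections are assumptions based on the fedora.linux_system_roles.network role's documented interface, while the connection data is copied from the module_args above.

    # Hedged sketch: reconstructed from the module_args logged above; the play wrapper
    # and variable names (network_provider, network_connections) are assumptions, not
    # taken verbatim from this run.
    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: ethtest0
            interface_name: ethtest0
            type: ethernet
            state: up
            autoconnect: false
            ip:
              address:
                - 198.51.100.3/24
                - 2001:db8::2/32
              route:
                - network: 198.51.10.64
                  prefix: 26
                  gateway: 198.51.100.6
                  metric: 4
                - network: 2001:db6::4
                  prefix: 128
                  gateway: 2001:db8::1
                  metric: 2
          - name: ethtest1
            interface_name: ethtest1
            type: ethernet
            state: up
            autoconnect: false
            ip:
              address:
                - 198.51.100.6/24
                - 2001:db8::4/32
              route:
                - network: 198.51.12.128
                  prefix: 26
                  gateway: 198.51.100.1
                  metric: 2

The UUIDs in the module stderr (6432d2d2-... and 82850e40-...) are assigned when the profiles are created, which is why they appear in the output but not in the input variables.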
27712 1727096493.31596: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 27712 1727096493.31692: in run() - task 0afff68d-5257-cbc7-8716-00000000002c 27712 1727096493.31696: variable 'ansible_search_path' from source: unknown 27712 1727096493.31701: variable 'ansible_search_path' from source: unknown 27712 1727096493.31796: calling self._execute() 27712 1727096493.31931: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.31943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.31957: variable 'omit' from source: magic vars 27712 1727096493.32302: variable 'ansible_distribution_major_version' from source: facts 27712 1727096493.32312: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096493.32413: variable 'network_state' from source: role '' defaults 27712 1727096493.32421: Evaluated conditional (network_state != {}): False 27712 1727096493.32424: when evaluation is False, skipping this task 27712 1727096493.32427: _execute() done 27712 1727096493.32430: dumping result to json 27712 1727096493.32432: done dumping result, returning 27712 1727096493.32439: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-cbc7-8716-00000000002c] 27712 1727096493.32444: sending task result for task 0afff68d-5257-cbc7-8716-00000000002c 27712 1727096493.32535: done sending task result for task 0afff68d-5257-cbc7-8716-00000000002c 27712 1727096493.32537: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096493.32616: no more pending results, returning what we have 27712 1727096493.32619: results queue empty 27712 1727096493.32620: checking for any_errors_fatal 27712 1727096493.32631: done checking for any_errors_fatal 27712 1727096493.32632: checking for max_fail_percentage 27712 1727096493.32633: done checking for max_fail_percentage 27712 1727096493.32634: checking to see if all hosts have failed and the running result is not ok 27712 1727096493.32635: done checking to see if all hosts have failed 27712 1727096493.32635: getting the remaining hosts for this loop 27712 1727096493.32636: done getting the remaining hosts for this loop 27712 1727096493.32639: getting the next task for host managed_node2 27712 1727096493.32645: done getting next task for host managed_node2 27712 1727096493.32650: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27712 1727096493.32653: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096493.32670: getting variables 27712 1727096493.32673: in VariableManager get_vars() 27712 1727096493.32707: Calling all_inventory to load vars for managed_node2 27712 1727096493.32709: Calling groups_inventory to load vars for managed_node2 27712 1727096493.32711: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096493.32719: Calling all_plugins_play to load vars for managed_node2 27712 1727096493.32721: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096493.32724: Calling groups_plugins_play to load vars for managed_node2 27712 1727096493.33473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096493.34617: done with get_vars() 27712 1727096493.34640: done getting variables 27712 1727096493.34705: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:01:33 -0400 (0:00:00.038) 0:00:19.040 ****** 27712 1727096493.34737: entering _queue_task() for managed_node2/debug 27712 1727096493.35050: worker is 1 (out of 1 available) 27712 1727096493.35063: exiting _queue_task() for managed_node2/debug 27712 1727096493.35079: done queuing things up, now waiting for results queue to drain 27712 1727096493.35080: waiting for pending results... 27712 1727096493.35271: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27712 1727096493.35363: in run() - task 0afff68d-5257-cbc7-8716-00000000002d 27712 1727096493.35380: variable 'ansible_search_path' from source: unknown 27712 1727096493.35384: variable 'ansible_search_path' from source: unknown 27712 1727096493.35413: calling self._execute() 27712 1727096493.35484: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.35488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.35498: variable 'omit' from source: magic vars 27712 1727096493.35770: variable 'ansible_distribution_major_version' from source: facts 27712 1727096493.35782: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096493.35788: variable 'omit' from source: magic vars 27712 1727096493.35827: variable 'omit' from source: magic vars 27712 1727096493.35853: variable 'omit' from source: magic vars 27712 1727096493.35888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096493.35914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096493.35932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096493.35945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096493.35954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096493.35983: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096493.35986: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.35990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.36054: Set connection var ansible_connection to ssh 27712 1727096493.36060: Set connection var ansible_pipelining to False 27712 1727096493.36065: Set connection var ansible_timeout to 10 27712 1727096493.36069: Set connection var ansible_shell_type to sh 27712 1727096493.36080: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096493.36084: Set connection var ansible_shell_executable to /bin/sh 27712 1727096493.36105: variable 'ansible_shell_executable' from source: unknown 27712 1727096493.36108: variable 'ansible_connection' from source: unknown 27712 1727096493.36111: variable 'ansible_module_compression' from source: unknown 27712 1727096493.36113: variable 'ansible_shell_type' from source: unknown 27712 1727096493.36115: variable 'ansible_shell_executable' from source: unknown 27712 1727096493.36117: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.36119: variable 'ansible_pipelining' from source: unknown 27712 1727096493.36121: variable 'ansible_timeout' from source: unknown 27712 1727096493.36123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.36224: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096493.36233: variable 'omit' from source: magic vars 27712 1727096493.36238: starting attempt loop 27712 1727096493.36241: running the handler 27712 1727096493.36340: variable '__network_connections_result' from source: set_fact 27712 1727096493.36390: handler run complete 27712 1727096493.36406: attempt loop complete, returning result 27712 1727096493.36409: _execute() done 27712 1727096493.36412: dumping result to json 27712 1727096493.36414: done dumping result, returning 27712 1727096493.36423: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-cbc7-8716-00000000002d] 27712 1727096493.36427: sending task result for task 0afff68d-5257-cbc7-8716-00000000002d 27712 1727096493.36508: done sending task result for task 0afff68d-5257-cbc7-8716-00000000002d 27712 1727096493.36511: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa (not-active)" ] } 27712 1727096493.36573: no more pending results, returning what we have 27712 1727096493.36576: results queue empty 27712 1727096493.36578: checking for any_errors_fatal 27712 1727096493.36583: done checking for any_errors_fatal 27712 1727096493.36583: checking for max_fail_percentage 
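Editor's note: the "Show stderr messages for the network_connections" result above comes from the role task at roles/network/tasks/main.yml:177. A minimal sketch of such a task is given below; the exact task body is an assumption, and only the displayed variable (__network_connections_result.stderr_lines) and the distribution-version conditional evaluated for it in the log (possibly inherited from an enclosing block or include) are taken from this run.

    # Minimal sketch, assuming a plain debug task; only the variable name and the
    # conditional seen in the log are taken from this run.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
      when: ansible_distribution_major_version != '6'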
27712 1727096493.36585: done checking for max_fail_percentage 27712 1727096493.36586: checking to see if all hosts have failed and the running result is not ok 27712 1727096493.36586: done checking to see if all hosts have failed 27712 1727096493.36587: getting the remaining hosts for this loop 27712 1727096493.36588: done getting the remaining hosts for this loop 27712 1727096493.36592: getting the next task for host managed_node2 27712 1727096493.36599: done getting next task for host managed_node2 27712 1727096493.36602: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27712 1727096493.36605: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096493.36615: getting variables 27712 1727096493.36617: in VariableManager get_vars() 27712 1727096493.36655: Calling all_inventory to load vars for managed_node2 27712 1727096493.36658: Calling groups_inventory to load vars for managed_node2 27712 1727096493.36660: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096493.36675: Calling all_plugins_play to load vars for managed_node2 27712 1727096493.36678: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096493.36681: Calling groups_plugins_play to load vars for managed_node2 27712 1727096493.37915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096493.39475: done with get_vars() 27712 1727096493.39498: done getting variables 27712 1727096493.39552: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:01:33 -0400 (0:00:00.048) 0:00:19.089 ****** 27712 1727096493.39593: entering _queue_task() for managed_node2/debug 27712 1727096493.39889: worker is 1 (out of 1 available) 27712 1727096493.39900: exiting _queue_task() for managed_node2/debug 27712 1727096493.39912: done queuing things up, now waiting for results queue to drain 27712 1727096493.39913: waiting for pending results... 
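Editor's note: the repeated "Set connection var ..." entries around each task show the connection settings the executor resolves for managed_node2 (ssh connection, sh shell, 10 second timeout, pipelining disabled). Purely as a point of reference, the same values could be pinned explicitly in a YAML inventory; the sketch below only mirrors what the log reports and uses Ansible's standard connection variable names.

    # Illustrative only: these mirror the connection vars the executor reports above.
    all:
      hosts:
        managed_node2:
          ansible_connection: ssh
          ansible_shell_type: sh
          ansible_shell_executable: /bin/sh
          ansible_timeout: 10
          ansible_pipelining: false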
27712 1727096493.40388: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27712 1727096493.40393: in run() - task 0afff68d-5257-cbc7-8716-00000000002e 27712 1727096493.40396: variable 'ansible_search_path' from source: unknown 27712 1727096493.40398: variable 'ansible_search_path' from source: unknown 27712 1727096493.40401: calling self._execute() 27712 1727096493.40486: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.40498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.40515: variable 'omit' from source: magic vars 27712 1727096493.40896: variable 'ansible_distribution_major_version' from source: facts 27712 1727096493.40913: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096493.40924: variable 'omit' from source: magic vars 27712 1727096493.40992: variable 'omit' from source: magic vars 27712 1727096493.41032: variable 'omit' from source: magic vars 27712 1727096493.41085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096493.41124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096493.41146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096493.41173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096493.41188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096493.41218: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096493.41225: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.41230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.41327: Set connection var ansible_connection to ssh 27712 1727096493.41384: Set connection var ansible_pipelining to False 27712 1727096493.41388: Set connection var ansible_timeout to 10 27712 1727096493.41390: Set connection var ansible_shell_type to sh 27712 1727096493.41392: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096493.41394: Set connection var ansible_shell_executable to /bin/sh 27712 1727096493.41404: variable 'ansible_shell_executable' from source: unknown 27712 1727096493.41411: variable 'ansible_connection' from source: unknown 27712 1727096493.41418: variable 'ansible_module_compression' from source: unknown 27712 1727096493.41424: variable 'ansible_shell_type' from source: unknown 27712 1727096493.41430: variable 'ansible_shell_executable' from source: unknown 27712 1727096493.41436: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.41443: variable 'ansible_pipelining' from source: unknown 27712 1727096493.41450: variable 'ansible_timeout' from source: unknown 27712 1727096493.41493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.41621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 
1727096493.41638: variable 'omit' from source: magic vars 27712 1727096493.41649: starting attempt loop 27712 1727096493.41655: running the handler 27712 1727096493.41722: variable '__network_connections_result' from source: set_fact 27712 1727096493.41812: variable '__network_connections_result' from source: set_fact 27712 1727096493.42031: handler run complete 27712 1727096493.42081: attempt loop complete, returning result 27712 1727096493.42094: _execute() done 27712 1727096493.42102: dumping result to json 27712 1727096493.42113: done dumping result, returning 27712 1727096493.42126: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-cbc7-8716-00000000002e] 27712 1727096493.42136: sending task result for task 0afff68d-5257-cbc7-8716-00000000002e 27712 1727096493.42476: done sending task result for task 0afff68d-5257-cbc7-8716-00000000002e 27712 1727096493.42480: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 6432d2d2-f377-4efe-a5e3-d4d7172e455e (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa (not-active)" ] } } 27712 1727096493.42603: no more pending results, returning what we have 27712 1727096493.42607: results queue empty 27712 1727096493.42608: checking for any_errors_fatal 27712 1727096493.42615: done checking for any_errors_fatal 27712 1727096493.42616: checking for max_fail_percentage 27712 1727096493.42617: done checking for max_fail_percentage 27712 1727096493.42618: checking to see if all hosts have failed and the running result is not ok 27712 1727096493.42619: done checking to see if all hosts have failed 27712 1727096493.42620: getting the remaining 
hosts for this loop 27712 1727096493.42621: done getting the remaining hosts for this loop 27712 1727096493.42625: getting the next task for host managed_node2 27712 1727096493.42631: done getting next task for host managed_node2 27712 1727096493.42635: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27712 1727096493.42638: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096493.42649: getting variables 27712 1727096493.42651: in VariableManager get_vars() 27712 1727096493.42695: Calling all_inventory to load vars for managed_node2 27712 1727096493.42698: Calling groups_inventory to load vars for managed_node2 27712 1727096493.42701: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096493.42711: Calling all_plugins_play to load vars for managed_node2 27712 1727096493.42714: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096493.42717: Calling groups_plugins_play to load vars for managed_node2 27712 1727096493.44149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096493.45740: done with get_vars() 27712 1727096493.45774: done getting variables 27712 1727096493.45838: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:01:33 -0400 (0:00:00.062) 0:00:19.152 ****** 27712 1727096493.45881: entering _queue_task() for managed_node2/debug 27712 1727096493.46243: worker is 1 (out of 1 available) 27712 1727096493.46259: exiting _queue_task() for managed_node2/debug 27712 1727096493.46276: done queuing things up, now waiting for results queue to drain 27712 1727096493.46277: waiting for pending results... 
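Editor's note: "Configure networking state" (earlier) and "Show debug messages for the network_state" (queued here, skipped in the entries that follow) are both skipped because network_state is left at the role default, an empty dict, so the condition network_state != {} evaluates to False. Purely as an assumption for illustration, a run that exercises the state-based path would supply a non-empty, nmstate-style network_state, roughly:

    # Assumption: a minimal, nmstate-style network_state; not taken from this run,
    # where network_state is left at its empty default.
    network_state:
      interfaces:
        - name: ethtest0
          type: ethernet
          state: up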
27712 1727096493.46582: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27712 1727096493.46731: in run() - task 0afff68d-5257-cbc7-8716-00000000002f 27712 1727096493.46752: variable 'ansible_search_path' from source: unknown 27712 1727096493.46759: variable 'ansible_search_path' from source: unknown 27712 1727096493.46805: calling self._execute() 27712 1727096493.46902: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.47028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.47031: variable 'omit' from source: magic vars 27712 1727096493.47301: variable 'ansible_distribution_major_version' from source: facts 27712 1727096493.47317: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096493.47435: variable 'network_state' from source: role '' defaults 27712 1727096493.47452: Evaluated conditional (network_state != {}): False 27712 1727096493.47463: when evaluation is False, skipping this task 27712 1727096493.47474: _execute() done 27712 1727096493.47482: dumping result to json 27712 1727096493.47488: done dumping result, returning 27712 1727096493.47498: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-cbc7-8716-00000000002f] 27712 1727096493.47506: sending task result for task 0afff68d-5257-cbc7-8716-00000000002f 27712 1727096493.47775: done sending task result for task 0afff68d-5257-cbc7-8716-00000000002f 27712 1727096493.47780: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 27712 1727096493.47828: no more pending results, returning what we have 27712 1727096493.47832: results queue empty 27712 1727096493.47833: checking for any_errors_fatal 27712 1727096493.47847: done checking for any_errors_fatal 27712 1727096493.47848: checking for max_fail_percentage 27712 1727096493.47850: done checking for max_fail_percentage 27712 1727096493.47851: checking to see if all hosts have failed and the running result is not ok 27712 1727096493.47852: done checking to see if all hosts have failed 27712 1727096493.47852: getting the remaining hosts for this loop 27712 1727096493.47854: done getting the remaining hosts for this loop 27712 1727096493.47857: getting the next task for host managed_node2 27712 1727096493.47865: done getting next task for host managed_node2 27712 1727096493.47873: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27712 1727096493.47877: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096493.47893: getting variables 27712 1727096493.47895: in VariableManager get_vars() 27712 1727096493.47939: Calling all_inventory to load vars for managed_node2 27712 1727096493.47942: Calling groups_inventory to load vars for managed_node2 27712 1727096493.47945: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096493.47955: Calling all_plugins_play to load vars for managed_node2 27712 1727096493.47958: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096493.47961: Calling groups_plugins_play to load vars for managed_node2 27712 1727096493.49473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096493.51112: done with get_vars() 27712 1727096493.51139: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:01:33 -0400 (0:00:00.053) 0:00:19.205 ****** 27712 1727096493.51243: entering _queue_task() for managed_node2/ping 27712 1727096493.51245: Creating lock for ping 27712 1727096493.51605: worker is 1 (out of 1 available) 27712 1727096493.51616: exiting _queue_task() for managed_node2/ping 27712 1727096493.51628: done queuing things up, now waiting for results queue to drain 27712 1727096493.51630: waiting for pending results... 27712 1727096493.51925: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 27712 1727096493.52073: in run() - task 0afff68d-5257-cbc7-8716-000000000030 27712 1727096493.52100: variable 'ansible_search_path' from source: unknown 27712 1727096493.52108: variable 'ansible_search_path' from source: unknown 27712 1727096493.52150: calling self._execute() 27712 1727096493.52255: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.52267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.52287: variable 'omit' from source: magic vars 27712 1727096493.52663: variable 'ansible_distribution_major_version' from source: facts 27712 1727096493.52685: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096493.52696: variable 'omit' from source: magic vars 27712 1727096493.52759: variable 'omit' from source: magic vars 27712 1727096493.52802: variable 'omit' from source: magic vars 27712 1727096493.52846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096493.53075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096493.53079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096493.53081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096493.53084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096493.53086: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096493.53088: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.53090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.53098: Set connection var ansible_connection to ssh 27712 
1727096493.53112: Set connection var ansible_pipelining to False 27712 1727096493.53123: Set connection var ansible_timeout to 10 27712 1727096493.53129: Set connection var ansible_shell_type to sh 27712 1727096493.53141: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096493.53150: Set connection var ansible_shell_executable to /bin/sh 27712 1727096493.53179: variable 'ansible_shell_executable' from source: unknown 27712 1727096493.53187: variable 'ansible_connection' from source: unknown 27712 1727096493.53195: variable 'ansible_module_compression' from source: unknown 27712 1727096493.53206: variable 'ansible_shell_type' from source: unknown 27712 1727096493.53214: variable 'ansible_shell_executable' from source: unknown 27712 1727096493.53221: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096493.53229: variable 'ansible_pipelining' from source: unknown 27712 1727096493.53236: variable 'ansible_timeout' from source: unknown 27712 1727096493.53243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096493.53454: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096493.53475: variable 'omit' from source: magic vars 27712 1727096493.53487: starting attempt loop 27712 1727096493.53493: running the handler 27712 1727096493.53510: _low_level_execute_command(): starting 27712 1727096493.53520: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096493.54238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096493.54252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096493.54283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096493.54303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096493.54403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096493.54428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096493.54537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096493.56229: stdout chunk (state=3): >>>/root <<< 27712 1727096493.56405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096493.56410: stdout chunk (state=3): >>><<< 27712 1727096493.56412: stderr chunk (state=3): >>><<< 27712 1727096493.56435: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096493.56546: _low_level_execute_command(): starting 27712 1727096493.56551: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943 `" && echo ansible-tmp-1727096493.5644338-28690-119671439073943="` echo /root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943 `" ) && sleep 0' 27712 1727096493.57060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096493.57063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096493.57066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096493.57078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096493.57081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096493.57119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096493.57130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096493.57155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096493.57224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096493.59186: stdout chunk (state=3): >>>ansible-tmp-1727096493.5644338-28690-119671439073943=/root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943 <<< 27712 1727096493.59375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 
1727096493.59378: stdout chunk (state=3): >>><<< 27712 1727096493.59380: stderr chunk (state=3): >>><<< 27712 1727096493.59383: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096493.5644338-28690-119671439073943=/root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096493.59427: variable 'ansible_module_compression' from source: unknown 27712 1727096493.59481: ANSIBALLZ: Using lock for ping 27712 1727096493.59489: ANSIBALLZ: Acquiring lock 27712 1727096493.59495: ANSIBALLZ: Lock acquired: 140297909117744 27712 1727096493.59502: ANSIBALLZ: Creating module 27712 1727096493.72078: ANSIBALLZ: Writing module into payload 27712 1727096493.72157: ANSIBALLZ: Writing module 27712 1727096493.72189: ANSIBALLZ: Renaming module 27712 1727096493.72203: ANSIBALLZ: Done creating module 27712 1727096493.72233: variable 'ansible_facts' from source: unknown 27712 1727096493.72312: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/AnsiballZ_ping.py 27712 1727096493.72549: Sending initial data 27712 1727096493.72552: Sent initial data (153 bytes) 27712 1727096493.73184: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096493.73242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096493.73259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096493.73288: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096493.73366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096493.75242: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096493.75246: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096493.75291: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp5vfewiel /root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/AnsiballZ_ping.py <<< 27712 1727096493.75296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/AnsiballZ_ping.py" <<< 27712 1727096493.75356: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 27712 1727096493.75373: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp5vfewiel" to remote "/root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/AnsiballZ_ping.py" <<< 27712 1727096493.75385: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/AnsiballZ_ping.py" <<< 27712 1727096493.76066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096493.76185: stderr chunk (state=3): >>><<< 27712 1727096493.76188: stdout chunk (state=3): >>><<< 27712 1727096493.76200: done transferring module to remote 27712 1727096493.76214: _low_level_execute_command(): starting 27712 1727096493.76222: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/ /root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/AnsiballZ_ping.py && sleep 0' 27712 1727096493.76863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096493.76879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096493.76892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096493.76958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096493.77014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096493.77030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096493.77062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096493.77118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096493.78964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096493.78969: stdout chunk (state=3): >>><<< 27712 1727096493.78972: stderr chunk (state=3): >>><<< 27712 1727096493.78987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096493.78994: _low_level_execute_command(): starting 27712 1727096493.79072: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/AnsiballZ_ping.py && sleep 0' 27712 1727096493.79683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096493.79724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096493.79743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096493.79769: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096493.79845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096493.95050: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27712 1727096493.96476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096493.96480: stdout chunk (state=3): >>><<< 27712 1727096493.96482: stderr chunk (state=3): >>><<< 27712 1727096493.96506: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
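[Editor's sketch] For readers following the trace, the YAML below is a minimal, hypothetical sketch of the kind of task that produces the round trip logged above: the ping module is packed as AnsiballZ_ping.py, uploaded over the multiplexed SSH/SFTP session into a per-task tmp directory, executed with the remote /usr/bin/python3.12, and answers {"ping": "pong"}. The role task referenced at roles/network/tasks/main.yml:192 is not reproduced in this log, so the play layout and the registered variable name here are assumptions, not the role's actual source.

```yaml
# Hypothetical sketch only -- the real task body at
# roles/network/tasks/main.yml:192 is not shown in this log.
# It mirrors what the trace records: ping is shipped to the host,
# run with the remote Python, and replies {"ping": "pong"}.
- hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Re-test connectivity
      ansible.builtin.ping:
      register: ping_result      # hypothetical name

    - name: Show the reply ("pong" on success)
      ansible.builtin.debug:
        var: ping_result.ping
```

A successful "pong" here only confirms that the controller can reach managed_node2 over the existing SSH connection and run a Python module there; the route verification itself follows in the next tasks of the trace.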
27712 1727096493.96608: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096493.96612: _low_level_execute_command(): starting 27712 1727096493.96615: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096493.5644338-28690-119671439073943/ > /dev/null 2>&1 && sleep 0' 27712 1727096493.97218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096493.97244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096493.97270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096493.97300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096493.97346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096493.97437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096493.97463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096493.97484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096493.97504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096493.97571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096493.99488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096493.99511: stderr chunk (state=3): >>><<< 27712 1727096493.99572: stdout chunk (state=3): >>><<< 27712 1727096493.99576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096493.99579: handler run complete 27712 1727096493.99581: attempt loop complete, returning result 27712 1727096493.99583: _execute() done 27712 1727096493.99585: dumping result to json 27712 1727096493.99587: done dumping result, returning 27712 1727096493.99589: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-cbc7-8716-000000000030] 27712 1727096493.99591: sending task result for task 0afff68d-5257-cbc7-8716-000000000030 27712 1727096493.99682: done sending task result for task 0afff68d-5257-cbc7-8716-000000000030 27712 1727096493.99685: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 27712 1727096493.99828: no more pending results, returning what we have 27712 1727096493.99832: results queue empty 27712 1727096493.99833: checking for any_errors_fatal 27712 1727096493.99840: done checking for any_errors_fatal 27712 1727096493.99841: checking for max_fail_percentage 27712 1727096493.99843: done checking for max_fail_percentage 27712 1727096493.99843: checking to see if all hosts have failed and the running result is not ok 27712 1727096493.99844: done checking to see if all hosts have failed 27712 1727096493.99845: getting the remaining hosts for this loop 27712 1727096493.99848: done getting the remaining hosts for this loop 27712 1727096493.99853: getting the next task for host managed_node2 27712 1727096493.99863: done getting next task for host managed_node2 27712 1727096493.99866: ^ task is: TASK: meta (role_complete) 27712 1727096493.99877: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096493.99889: getting variables 27712 1727096493.99891: in VariableManager get_vars() 27712 1727096493.99935: Calling all_inventory to load vars for managed_node2 27712 1727096493.99938: Calling groups_inventory to load vars for managed_node2 27712 1727096493.99941: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096493.99951: Calling all_plugins_play to load vars for managed_node2 27712 1727096493.99955: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096493.99957: Calling groups_plugins_play to load vars for managed_node2 27712 1727096494.01681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096494.03415: done with get_vars() 27712 1727096494.03438: done getting variables 27712 1727096494.03539: done queuing things up, now waiting for results queue to drain 27712 1727096494.03541: results queue empty 27712 1727096494.03542: checking for any_errors_fatal 27712 1727096494.03545: done checking for any_errors_fatal 27712 1727096494.03546: checking for max_fail_percentage 27712 1727096494.03547: done checking for max_fail_percentage 27712 1727096494.03547: checking to see if all hosts have failed and the running result is not ok 27712 1727096494.03548: done checking to see if all hosts have failed 27712 1727096494.03549: getting the remaining hosts for this loop 27712 1727096494.03550: done getting the remaining hosts for this loop 27712 1727096494.03552: getting the next task for host managed_node2 27712 1727096494.03557: done getting next task for host managed_node2 27712 1727096494.03559: ^ task is: TASK: Get the IPv4 routes from the route table main 27712 1727096494.03561: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096494.03563: getting variables 27712 1727096494.03569: in VariableManager get_vars() 27712 1727096494.03586: Calling all_inventory to load vars for managed_node2 27712 1727096494.03589: Calling groups_inventory to load vars for managed_node2 27712 1727096494.03591: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096494.03596: Calling all_plugins_play to load vars for managed_node2 27712 1727096494.03598: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096494.03601: Calling groups_plugins_play to load vars for managed_node2 27712 1727096494.04776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096494.06356: done with get_vars() 27712 1727096494.06388: done getting variables 27712 1727096494.06431: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv4 routes from the route table main] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:73 Monday 23 September 2024 09:01:34 -0400 (0:00:00.552) 0:00:19.757 ****** 27712 1727096494.06459: entering _queue_task() for managed_node2/command 27712 1727096494.07038: worker is 1 (out of 1 available) 27712 1727096494.07052: exiting _queue_task() for managed_node2/command 27712 1727096494.07065: done queuing things up, now waiting for results queue to drain 27712 1727096494.07066: waiting for pending results... 27712 1727096494.07680: running TaskExecutor() for managed_node2/TASK: Get the IPv4 routes from the route table main 27712 1727096494.07874: in run() - task 0afff68d-5257-cbc7-8716-000000000060 27712 1727096494.07926: variable 'ansible_search_path' from source: unknown 27712 1727096494.07931: calling self._execute() 27712 1727096494.08275: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.08278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.08281: variable 'omit' from source: magic vars 27712 1727096494.09066: variable 'ansible_distribution_major_version' from source: facts 27712 1727096494.09132: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096494.09146: variable 'omit' from source: magic vars 27712 1727096494.09200: variable 'omit' from source: magic vars 27712 1727096494.09278: variable 'omit' from source: magic vars 27712 1727096494.09391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096494.09489: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096494.09562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096494.09607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.09847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.09851: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096494.09853: variable 'ansible_host' 
from source: host vars for 'managed_node2' 27712 1727096494.09855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.09933: Set connection var ansible_connection to ssh 27712 1727096494.09974: Set connection var ansible_pipelining to False 27712 1727096494.09986: Set connection var ansible_timeout to 10 27712 1727096494.09999: Set connection var ansible_shell_type to sh 27712 1727096494.10013: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096494.10024: Set connection var ansible_shell_executable to /bin/sh 27712 1727096494.10052: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.10060: variable 'ansible_connection' from source: unknown 27712 1727096494.10069: variable 'ansible_module_compression' from source: unknown 27712 1727096494.10078: variable 'ansible_shell_type' from source: unknown 27712 1727096494.10084: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.10091: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.10098: variable 'ansible_pipelining' from source: unknown 27712 1727096494.10109: variable 'ansible_timeout' from source: unknown 27712 1727096494.10116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.10258: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096494.10277: variable 'omit' from source: magic vars 27712 1727096494.10289: starting attempt loop 27712 1727096494.10295: running the handler 27712 1727096494.10315: _low_level_execute_command(): starting 27712 1727096494.10372: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096494.11013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096494.11030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.11097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.11108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.11224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.11243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.11270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.11345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.13048: 
stdout chunk (state=3): >>>/root <<< 27712 1727096494.13146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.13186: stderr chunk (state=3): >>><<< 27712 1727096494.13188: stdout chunk (state=3): >>><<< 27712 1727096494.13206: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096494.13237: _low_level_execute_command(): starting 27712 1727096494.13241: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829 `" && echo ansible-tmp-1727096494.1320906-28706-265253327683829="` echo /root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829 `" ) && sleep 0' 27712 1727096494.13907: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.13911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096494.13913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096494.13916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.13927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.13930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096494.13933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.14023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.15943: stdout chunk 
(state=3): >>>ansible-tmp-1727096494.1320906-28706-265253327683829=/root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829 <<< 27712 1727096494.16054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.16084: stderr chunk (state=3): >>><<< 27712 1727096494.16087: stdout chunk (state=3): >>><<< 27712 1727096494.16103: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096494.1320906-28706-265253327683829=/root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096494.16131: variable 'ansible_module_compression' from source: unknown 27712 1727096494.16201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096494.16215: variable 'ansible_facts' from source: unknown 27712 1727096494.16376: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/AnsiballZ_command.py 27712 1727096494.16513: Sending initial data 27712 1727096494.16681: Sent initial data (156 bytes) 27712 1727096494.17038: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096494.17054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.17080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096494.17173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.17194: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.17254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.18845: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27712 1727096494.18876: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096494.18902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096494.18931: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmps2lwdx4l /root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/AnsiballZ_command.py <<< 27712 1727096494.18943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/AnsiballZ_command.py" <<< 27712 1727096494.18965: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmps2lwdx4l" to remote "/root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/AnsiballZ_command.py" <<< 27712 1727096494.18969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/AnsiballZ_command.py" <<< 27712 1727096494.19508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.19659: stderr chunk (state=3): >>><<< 27712 1727096494.19662: stdout chunk (state=3): >>><<< 27712 1727096494.19664: done transferring module to remote 27712 1727096494.19666: _low_level_execute_command(): starting 27712 1727096494.19670: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/ /root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/AnsiballZ_command.py && sleep 0' 27712 1727096494.20194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.20236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.20277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.20296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.20352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.22137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.22159: stderr chunk (state=3): >>><<< 27712 1727096494.22163: stdout chunk (state=3): >>><<< 27712 1727096494.22181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096494.22184: _low_level_execute_command(): starting 27712 1727096494.22187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/AnsiballZ_command.py && sleep 0' 27712 1727096494.22627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.22631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.22635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096494.22638: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096494.22640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.22690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.22694: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.22698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.22739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.38433: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-23 09:01:34.379196", "end": "2024-09-23 09:01:34.383148", "delta": "0:00:00.003952", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096494.40025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096494.40052: stderr chunk (state=3): >>><<< 27712 1727096494.40056: stdout chunk (state=3): >>><<< 27712 1727096494.40072: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-23 09:01:34.379196", "end": "2024-09-23 09:01:34.383148", "delta": "0:00:00.003952", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096494.40109: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096494.40116: _low_level_execute_command(): starting 27712 1727096494.40121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096494.1320906-28706-265253327683829/ > /dev/null 2>&1 && sleep 0' 27712 1727096494.40584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.40587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.40590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096494.40592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.40594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.40642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.40645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.40653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.40689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.42512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.42540: stderr chunk (state=3): >>><<< 27712 1727096494.42543: stdout chunk (state=3): >>><<< 27712 1727096494.42556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096494.42562: handler run complete 27712 1727096494.42587: Evaluated conditional (False): False 27712 1727096494.42596: attempt loop complete, returning result 27712 1727096494.42598: _execute() done 27712 1727096494.42601: dumping result to json 27712 1727096494.42606: done dumping result, returning 27712 1727096494.42614: done running TaskExecutor() for managed_node2/TASK: Get the IPv4 routes from the route table main [0afff68d-5257-cbc7-8716-000000000060] 27712 1727096494.42618: sending task result for task 0afff68d-5257-cbc7-8716-000000000060 27712 1727096494.42717: done sending task result for task 0afff68d-5257-cbc7-8716-000000000060 27712 1727096494.42720: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "-4", "route" ], "delta": "0:00:00.003952", "end": "2024-09-23 09:01:34.383148", "rc": 0, "start": "2024-09-23 09:01:34.379196" } STDOUT: default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 27712 1727096494.42801: no more pending results, returning what we have 27712 1727096494.42804: results queue empty 27712 1727096494.42805: checking for any_errors_fatal 27712 1727096494.42807: done checking for any_errors_fatal 27712 1727096494.42808: checking for max_fail_percentage 27712 1727096494.42809: done checking for max_fail_percentage 27712 1727096494.42810: checking to see if all hosts have failed and the running result is not ok 27712 1727096494.42811: done checking to see if all hosts have failed 27712 1727096494.42812: getting the remaining hosts for this loop 27712 1727096494.42813: done getting the remaining hosts for this loop 27712 1727096494.42816: getting the next task for host managed_node2 27712 1727096494.42823: done getting next task for host managed_node2 27712 1727096494.42825: ^ task is: TASK: Assert that the route table main contains the specified IPv4 routes 27712 1727096494.42827: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096494.42838: getting variables 27712 1727096494.42839: in VariableManager get_vars() 27712 1727096494.42879: Calling all_inventory to load vars for managed_node2 27712 1727096494.42882: Calling groups_inventory to load vars for managed_node2 27712 1727096494.42885: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096494.42895: Calling all_plugins_play to load vars for managed_node2 27712 1727096494.42897: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096494.42900: Calling groups_plugins_play to load vars for managed_node2 27712 1727096494.46934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096494.47766: done with get_vars() 27712 1727096494.47786: done getting variables 27712 1727096494.47822: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv4 routes] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:78 Monday 23 September 2024 09:01:34 -0400 (0:00:00.413) 0:00:20.171 ****** 27712 1727096494.47839: entering _queue_task() for managed_node2/assert 27712 1727096494.48103: worker is 1 (out of 1 available) 27712 1727096494.48118: exiting _queue_task() for managed_node2/assert 27712 1727096494.48130: done queuing things up, now waiting for results queue to drain 27712 1727096494.48131: waiting for pending results... 27712 1727096494.48315: running TaskExecutor() for managed_node2/TASK: Assert that the route table main contains the specified IPv4 routes 27712 1727096494.48384: in run() - task 0afff68d-5257-cbc7-8716-000000000061 27712 1727096494.48395: variable 'ansible_search_path' from source: unknown 27712 1727096494.48423: calling self._execute() 27712 1727096494.48503: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.48508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.48517: variable 'omit' from source: magic vars 27712 1727096494.48808: variable 'ansible_distribution_major_version' from source: facts 27712 1727096494.48818: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096494.48824: variable 'omit' from source: magic vars 27712 1727096494.48840: variable 'omit' from source: magic vars 27712 1727096494.48867: variable 'omit' from source: magic vars 27712 1727096494.48908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096494.48929: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096494.48945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096494.48958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.48969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.48994: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096494.48998: variable 
'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.49001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.49070: Set connection var ansible_connection to ssh 27712 1727096494.49079: Set connection var ansible_pipelining to False 27712 1727096494.49085: Set connection var ansible_timeout to 10 27712 1727096494.49087: Set connection var ansible_shell_type to sh 27712 1727096494.49094: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096494.49099: Set connection var ansible_shell_executable to /bin/sh 27712 1727096494.49116: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.49119: variable 'ansible_connection' from source: unknown 27712 1727096494.49122: variable 'ansible_module_compression' from source: unknown 27712 1727096494.49125: variable 'ansible_shell_type' from source: unknown 27712 1727096494.49127: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.49129: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.49132: variable 'ansible_pipelining' from source: unknown 27712 1727096494.49135: variable 'ansible_timeout' from source: unknown 27712 1727096494.49138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.49239: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096494.49249: variable 'omit' from source: magic vars 27712 1727096494.49252: starting attempt loop 27712 1727096494.49255: running the handler 27712 1727096494.49370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096494.49542: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096494.49600: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096494.49626: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096494.49651: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096494.49720: variable 'route_table_main_ipv4' from source: set_fact 27712 1727096494.49744: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")): True 27712 1727096494.49841: variable 'route_table_main_ipv4' from source: set_fact 27712 1727096494.49863: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")): True 27712 1727096494.49869: handler run complete 27712 1727096494.49884: attempt loop complete, returning result 27712 1727096494.49887: _execute() done 27712 1727096494.49890: dumping result to json 27712 1727096494.49892: done dumping result, returning 27712 1727096494.49899: done running TaskExecutor() for managed_node2/TASK: Assert that the route table main contains the specified IPv4 routes [0afff68d-5257-cbc7-8716-000000000061] 27712 1727096494.49903: sending task result for task 0afff68d-5257-cbc7-8716-000000000061 27712 1727096494.49991: done sending task result for task 
0afff68d-5257-cbc7-8716-000000000061 27712 1727096494.49994: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096494.50075: no more pending results, returning what we have 27712 1727096494.50077: results queue empty 27712 1727096494.50078: checking for any_errors_fatal 27712 1727096494.50087: done checking for any_errors_fatal 27712 1727096494.50087: checking for max_fail_percentage 27712 1727096494.50089: done checking for max_fail_percentage 27712 1727096494.50090: checking to see if all hosts have failed and the running result is not ok 27712 1727096494.50091: done checking to see if all hosts have failed 27712 1727096494.50091: getting the remaining hosts for this loop 27712 1727096494.50092: done getting the remaining hosts for this loop 27712 1727096494.50095: getting the next task for host managed_node2 27712 1727096494.50101: done getting next task for host managed_node2 27712 1727096494.50103: ^ task is: TASK: Get the IPv6 routes from the route table main 27712 1727096494.50105: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096494.50107: getting variables 27712 1727096494.50110: in VariableManager get_vars() 27712 1727096494.50145: Calling all_inventory to load vars for managed_node2 27712 1727096494.50148: Calling groups_inventory to load vars for managed_node2 27712 1727096494.50150: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096494.50158: Calling all_plugins_play to load vars for managed_node2 27712 1727096494.50161: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096494.50163: Calling groups_plugins_play to load vars for managed_node2 27712 1727096494.50932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096494.51791: done with get_vars() 27712 1727096494.51809: done getting variables 27712 1727096494.51849: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv6 routes from the route table main] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:89 Monday 23 September 2024 09:01:34 -0400 (0:00:00.040) 0:00:20.212 ****** 27712 1727096494.51876: entering _queue_task() for managed_node2/command 27712 1727096494.52104: worker is 1 (out of 1 available) 27712 1727096494.52116: exiting _queue_task() for managed_node2/command 27712 1727096494.52127: done queuing things up, now waiting for results queue to drain 27712 1727096494.52129: waiting for pending results... 
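The assert evaluated just above applies two Jinja2 "search" tests to the registered IPv4 route listing, using exactly the patterns quoted in the conditionals. As a point of reference, here is a minimal sketch of how such an assertion is typically written; the actual YAML of tests_route_device.yml is not reproduced in this trace, so the layout is an assumption, and only the task name, the route_table_main_ipv4 variable, and the two patterns come from the log:

- name: Assert that the route table main contains the specified IPv4 routes
  assert:
    that:
      # Each regex must be found somewhere in the registered "ip route" stdout;
      # the optional "(proto static )?" group tolerates output without the proto field.
      - route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")
      - route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")

Both tests evaluate to True in the trace above, which is why the task returns "All assertions passed" with changed: false.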
27712 1727096494.52305: running TaskExecutor() for managed_node2/TASK: Get the IPv6 routes from the route table main 27712 1727096494.52370: in run() - task 0afff68d-5257-cbc7-8716-000000000062 27712 1727096494.52384: variable 'ansible_search_path' from source: unknown 27712 1727096494.52412: calling self._execute() 27712 1727096494.52490: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.52495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.52503: variable 'omit' from source: magic vars 27712 1727096494.52788: variable 'ansible_distribution_major_version' from source: facts 27712 1727096494.52800: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096494.52803: variable 'omit' from source: magic vars 27712 1727096494.52820: variable 'omit' from source: magic vars 27712 1727096494.52845: variable 'omit' from source: magic vars 27712 1727096494.52879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096494.52910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096494.52925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096494.52938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.52947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.52972: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096494.52977: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.52979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.53048: Set connection var ansible_connection to ssh 27712 1727096494.53055: Set connection var ansible_pipelining to False 27712 1727096494.53060: Set connection var ansible_timeout to 10 27712 1727096494.53062: Set connection var ansible_shell_type to sh 27712 1727096494.53070: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096494.53078: Set connection var ansible_shell_executable to /bin/sh 27712 1727096494.53094: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.53097: variable 'ansible_connection' from source: unknown 27712 1727096494.53100: variable 'ansible_module_compression' from source: unknown 27712 1727096494.53103: variable 'ansible_shell_type' from source: unknown 27712 1727096494.53105: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.53108: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.53110: variable 'ansible_pipelining' from source: unknown 27712 1727096494.53112: variable 'ansible_timeout' from source: unknown 27712 1727096494.53118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.53219: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096494.53229: variable 'omit' from source: magic vars 27712 1727096494.53232: starting attempt loop 27712 
1727096494.53235: running the handler 27712 1727096494.53249: _low_level_execute_command(): starting 27712 1727096494.53256: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096494.53763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.53771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.53775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096494.53780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.53829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.53832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.53835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.53883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.55562: stdout chunk (state=3): >>>/root <<< 27712 1727096494.55660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.55692: stderr chunk (state=3): >>><<< 27712 1727096494.55696: stdout chunk (state=3): >>><<< 27712 1727096494.55723: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096494.55734: _low_level_execute_command(): starting 27712 1727096494.55740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399 `" && echo ansible-tmp-1727096494.5571945-28726-106465089503399="` echo /root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399 `" ) && sleep 0' 27712 1727096494.56193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.56196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.56199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096494.56209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.56250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.56257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.56260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.56293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.58215: stdout chunk (state=3): >>>ansible-tmp-1727096494.5571945-28726-106465089503399=/root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399 <<< 27712 1727096494.58322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.58355: stderr chunk (state=3): >>><<< 27712 1727096494.58359: stdout chunk (state=3): >>><<< 27712 1727096494.58381: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096494.5571945-28726-106465089503399=/root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 27712 1727096494.58406: variable 'ansible_module_compression' from source: unknown 27712 1727096494.58446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096494.58483: variable 'ansible_facts' from source: unknown 27712 1727096494.58537: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/AnsiballZ_command.py 27712 1727096494.58639: Sending initial data 27712 1727096494.58642: Sent initial data (156 bytes) 27712 1727096494.59096: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.59099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096494.59102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.59104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.59108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096494.59110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.59158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.59161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.59165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.59194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.60778: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 27712 1727096494.60788: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096494.60804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096494.60842: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp3cknm4c8 /root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/AnsiballZ_command.py <<< 27712 1727096494.60851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/AnsiballZ_command.py" <<< 27712 1727096494.60874: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp3cknm4c8" to remote "/root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/AnsiballZ_command.py" <<< 27712 1727096494.60876: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/AnsiballZ_command.py" <<< 27712 1727096494.61354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.61401: stderr chunk (state=3): >>><<< 27712 1727096494.61405: stdout chunk (state=3): >>><<< 27712 1727096494.61444: done transferring module to remote 27712 1727096494.61454: _low_level_execute_command(): starting 27712 1727096494.61458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/ /root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/AnsiballZ_command.py && sleep 0' 27712 1727096494.61908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.61914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096494.61916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.61921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.61923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.61979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.61983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.61986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.62016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.63800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.63825: stderr chunk (state=3): >>><<< 27712 1727096494.63828: stdout chunk (state=3): >>><<< 27712 1727096494.63842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096494.63845: _low_level_execute_command(): starting 27712 1727096494.63849: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/AnsiballZ_command.py && sleep 0' 27712 1727096494.64275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.64279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.64292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.64347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.64353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.64355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.64391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.80053: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-23 09:01:34.795474", "end": "2024-09-23 09:01:34.799454", "delta": "0:00:00.003980", "msg": "", 
"invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096494.81703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096494.81730: stderr chunk (state=3): >>><<< 27712 1727096494.81733: stdout chunk (state=3): >>><<< 27712 1727096494.81750: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-23 09:01:34.795474", "end": "2024-09-23 09:01:34.799454", "delta": "0:00:00.003980", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096494.81787: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096494.81793: _low_level_execute_command(): starting 27712 1727096494.81798: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096494.5571945-28726-106465089503399/ > /dev/null 2>&1 && sleep 0' 27712 1727096494.82257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.82261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.82263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.82265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.82330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.82333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.82337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.82366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.84219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.84249: stderr chunk (state=3): >>><<< 27712 1727096494.84252: stdout chunk (state=3): >>><<< 27712 1727096494.84266: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096494.84276: handler run complete 27712 1727096494.84293: Evaluated conditional (False): False 27712 1727096494.84302: attempt loop complete, returning result 27712 1727096494.84306: _execute() done 27712 1727096494.84309: dumping result to json 27712 1727096494.84314: done dumping result, returning 27712 1727096494.84320: done running TaskExecutor() for managed_node2/TASK: Get the IPv6 routes from the route table main [0afff68d-5257-cbc7-8716-000000000062] 27712 1727096494.84324: sending task result for task 0afff68d-5257-cbc7-8716-000000000062 27712 1727096494.84421: done sending task result for task 0afff68d-5257-cbc7-8716-000000000062 27712 1727096494.84424: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003980", "end": "2024-09-23 09:01:34.799454", "rc": 0, "start": "2024-09-23 09:01:34.795474" } STDOUT: 2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium 2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium 2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium fe80::/64 dev peerethtest0 proto kernel metric 256 pref medium fe80::/64 dev peerethtest1 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest1 proto kernel metric 1024 pref medium 27712 1727096494.84499: no more pending results, returning what we have 27712 1727096494.84502: results queue empty 27712 1727096494.84503: checking for any_errors_fatal 27712 1727096494.84510: done checking for any_errors_fatal 27712 1727096494.84510: checking for max_fail_percentage 27712 1727096494.84512: done checking for max_fail_percentage 27712 1727096494.84513: checking to see if all hosts have failed and the running result is not ok 27712 1727096494.84514: done checking to see if all hosts have failed 27712 1727096494.84514: getting the remaining hosts for this loop 27712 1727096494.84515: done getting the remaining hosts for this loop 27712 1727096494.84518: getting the next task for host managed_node2 27712 1727096494.84524: done getting next task for host managed_node2 27712 1727096494.84527: ^ task is: TASK: Assert that the route table main contains the specified IPv6 routes 27712 1727096494.84529: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096494.84539: getting variables 27712 1727096494.84540: in VariableManager get_vars() 27712 1727096494.84582: Calling all_inventory to load vars for managed_node2 27712 1727096494.84585: Calling groups_inventory to load vars for managed_node2 27712 1727096494.84587: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096494.84598: Calling all_plugins_play to load vars for managed_node2 27712 1727096494.84600: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096494.84603: Calling groups_plugins_play to load vars for managed_node2 27712 1727096494.85600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096494.86450: done with get_vars() 27712 1727096494.86466: done getting variables 27712 1727096494.86514: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv6 routes] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:94 Monday 23 September 2024 09:01:34 -0400 (0:00:00.346) 0:00:20.558 ****** 27712 1727096494.86535: entering _queue_task() for managed_node2/assert 27712 1727096494.86788: worker is 1 (out of 1 available) 27712 1727096494.86801: exiting _queue_task() for managed_node2/assert 27712 1727096494.86814: done queuing things up, now waiting for results queue to drain 27712 1727096494.86816: waiting for pending results... 27712 1727096494.87203: running TaskExecutor() for managed_node2/TASK: Assert that the route table main contains the specified IPv6 routes 27712 1727096494.87208: in run() - task 0afff68d-5257-cbc7-8716-000000000063 27712 1727096494.87211: variable 'ansible_search_path' from source: unknown 27712 1727096494.87213: calling self._execute() 27712 1727096494.87311: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.87323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.87475: variable 'omit' from source: magic vars 27712 1727096494.87725: variable 'ansible_distribution_major_version' from source: facts 27712 1727096494.87742: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096494.87752: variable 'omit' from source: magic vars 27712 1727096494.87783: variable 'omit' from source: magic vars 27712 1727096494.87822: variable 'omit' from source: magic vars 27712 1727096494.87890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096494.87928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096494.87955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096494.87982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.87998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.88034: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096494.88043: variable 
'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.88052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.88157: Set connection var ansible_connection to ssh 27712 1727096494.88176: Set connection var ansible_pipelining to False 27712 1727096494.88187: Set connection var ansible_timeout to 10 27712 1727096494.88194: Set connection var ansible_shell_type to sh 27712 1727096494.88205: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096494.88214: Set connection var ansible_shell_executable to /bin/sh 27712 1727096494.88238: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.88251: variable 'ansible_connection' from source: unknown 27712 1727096494.88258: variable 'ansible_module_compression' from source: unknown 27712 1727096494.88265: variable 'ansible_shell_type' from source: unknown 27712 1727096494.88277: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.88284: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.88293: variable 'ansible_pipelining' from source: unknown 27712 1727096494.88300: variable 'ansible_timeout' from source: unknown 27712 1727096494.88354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.88447: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096494.88469: variable 'omit' from source: magic vars 27712 1727096494.88483: starting attempt loop 27712 1727096494.88490: running the handler 27712 1727096494.88692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096494.88905: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096494.88938: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096494.89003: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096494.89024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096494.89088: variable 'route_table_main_ipv6' from source: set_fact 27712 1727096494.89120: Evaluated conditional (route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")): True 27712 1727096494.89123: handler run complete 27712 1727096494.89133: attempt loop complete, returning result 27712 1727096494.89136: _execute() done 27712 1727096494.89139: dumping result to json 27712 1727096494.89141: done dumping result, returning 27712 1727096494.89147: done running TaskExecutor() for managed_node2/TASK: Assert that the route table main contains the specified IPv6 routes [0afff68d-5257-cbc7-8716-000000000063] 27712 1727096494.89152: sending task result for task 0afff68d-5257-cbc7-8716-000000000063 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096494.89292: no more pending results, returning what we have 27712 1727096494.89296: results queue empty 27712 1727096494.89296: checking for any_errors_fatal 27712 1727096494.89305: done checking for any_errors_fatal 27712 1727096494.89306: checking for 
max_fail_percentage 27712 1727096494.89308: done checking for max_fail_percentage 27712 1727096494.89308: checking to see if all hosts have failed and the running result is not ok 27712 1727096494.89309: done checking to see if all hosts have failed 27712 1727096494.89310: getting the remaining hosts for this loop 27712 1727096494.89311: done getting the remaining hosts for this loop 27712 1727096494.89314: getting the next task for host managed_node2 27712 1727096494.89320: done getting next task for host managed_node2 27712 1727096494.89324: ^ task is: TASK: Get the interface1 MAC address 27712 1727096494.89326: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096494.89329: getting variables 27712 1727096494.89331: in VariableManager get_vars() 27712 1727096494.89374: Calling all_inventory to load vars for managed_node2 27712 1727096494.89377: Calling groups_inventory to load vars for managed_node2 27712 1727096494.89379: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096494.89389: Calling all_plugins_play to load vars for managed_node2 27712 1727096494.89392: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096494.89394: Calling groups_plugins_play to load vars for managed_node2 27712 1727096494.89981: done sending task result for task 0afff68d-5257-cbc7-8716-000000000063 27712 1727096494.89984: WORKER PROCESS EXITING 27712 1727096494.90204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096494.91533: done with get_vars() 27712 1727096494.91557: done getting variables 27712 1727096494.91617: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the interface1 MAC address] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:99 Monday 23 September 2024 09:01:34 -0400 (0:00:00.051) 0:00:20.609 ****** 27712 1727096494.91644: entering _queue_task() for managed_node2/command 27712 1727096494.91978: worker is 1 (out of 1 available) 27712 1727096494.91990: exiting _queue_task() for managed_node2/command 27712 1727096494.92003: done queuing things up, now waiting for results queue to drain 27712 1727096494.92004: waiting for pending results... 
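The task queued here, "Get the interface1 MAC address", pulls the interface1 play variable into its command line (visible just below as "variable 'interface1' from source: play vars"). The command itself is not shown in this part of the trace, so the following is only a hedged sketch using a conventional sysfs lookup, with a hypothetical register name:

- name: Get the interface1 MAC address
  command: cat /sys/class/net/{{ interface1 }}/address
  register: interface1_mac   # hypothetical name; the real registered variable is not visible in this excerpt
  changed_when: false        # assumed, to keep a read-only query from reporting a change
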
27712 1727096494.92285: running TaskExecutor() for managed_node2/TASK: Get the interface1 MAC address 27712 1727096494.92412: in run() - task 0afff68d-5257-cbc7-8716-000000000064 27712 1727096494.92416: variable 'ansible_search_path' from source: unknown 27712 1727096494.92424: calling self._execute() 27712 1727096494.92535: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.92546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.92598: variable 'omit' from source: magic vars 27712 1727096494.92957: variable 'ansible_distribution_major_version' from source: facts 27712 1727096494.92979: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096494.92991: variable 'omit' from source: magic vars 27712 1727096494.93014: variable 'omit' from source: magic vars 27712 1727096494.93110: variable 'interface1' from source: play vars 27712 1727096494.93140: variable 'omit' from source: magic vars 27712 1727096494.93250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096494.93254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096494.93256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096494.93264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.93285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096494.93320: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096494.93328: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.93336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.93451: Set connection var ansible_connection to ssh 27712 1727096494.93473: Set connection var ansible_pipelining to False 27712 1727096494.93484: Set connection var ansible_timeout to 10 27712 1727096494.93490: Set connection var ansible_shell_type to sh 27712 1727096494.93501: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096494.93509: Set connection var ansible_shell_executable to /bin/sh 27712 1727096494.93532: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.93576: variable 'ansible_connection' from source: unknown 27712 1727096494.93580: variable 'ansible_module_compression' from source: unknown 27712 1727096494.93582: variable 'ansible_shell_type' from source: unknown 27712 1727096494.93585: variable 'ansible_shell_executable' from source: unknown 27712 1727096494.93587: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096494.93589: variable 'ansible_pipelining' from source: unknown 27712 1727096494.93590: variable 'ansible_timeout' from source: unknown 27712 1727096494.93592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096494.93718: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096494.93794: variable 'omit' from source: magic vars 27712 
1727096494.93797: starting attempt loop 27712 1727096494.93799: running the handler 27712 1727096494.93801: _low_level_execute_command(): starting 27712 1727096494.93803: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096494.94507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096494.94521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.94574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.94593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096494.94606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096494.94680: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.94709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.94726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.94748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.94817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.96533: stdout chunk (state=3): >>>/root <<< 27712 1727096494.96708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096494.96712: stdout chunk (state=3): >>><<< 27712 1727096494.96714: stderr chunk (state=3): >>><<< 27712 1727096494.96738: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096494.96843: _low_level_execute_command(): starting 27712 1727096494.96848: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882 `" && echo ansible-tmp-1727096494.9674532-28736-237613785229882="` echo /root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882 `" ) && sleep 0' 27712 1727096494.97551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096494.97567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096494.97653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096494.97673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096494.97722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096494.97747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096494.97786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096494.97904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096494.99848: stdout chunk (state=3): >>>ansible-tmp-1727096494.9674532-28736-237613785229882=/root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882 <<< 27712 1727096494.99956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096495.00016: stderr chunk (state=3): >>><<< 27712 1727096495.00019: stdout chunk (state=3): >>><<< 27712 1727096495.00037: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096494.9674532-28736-237613785229882=/root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096495.00174: variable 'ansible_module_compression' from source: unknown 27712 1727096495.00177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096495.00179: variable 'ansible_facts' from source: unknown 27712 1727096495.00266: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/AnsiballZ_command.py 27712 1727096495.00486: Sending initial data 27712 1727096495.00497: Sent initial data (156 bytes) 27712 1727096495.00981: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096495.00996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096495.01085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.01119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.01152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.01194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096495.02795: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096495.02837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096495.02872: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpg4ukc9k7 /root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/AnsiballZ_command.py <<< 27712 1727096495.02877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/AnsiballZ_command.py" <<< 27712 1727096495.02901: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpg4ukc9k7" to remote "/root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/AnsiballZ_command.py" <<< 27712 1727096495.02903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/AnsiballZ_command.py" <<< 27712 1727096495.03676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096495.03680: stderr chunk (state=3): >>><<< 27712 1727096495.03682: stdout chunk (state=3): >>><<< 27712 1727096495.03684: done transferring module to remote 27712 1727096495.03686: _low_level_execute_command(): starting 27712 1727096495.03689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/ /root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/AnsiballZ_command.py && sleep 0' 27712 1727096495.04275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096495.04313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096495.04384: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.04405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096495.04439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.04448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.04508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096495.06396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096495.06399: stdout chunk (state=3): >>><<< 27712 1727096495.06401: stderr chunk (state=3): >>><<< 27712 1727096495.06493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096495.06496: _low_level_execute_command(): starting 27712 1727096495.06499: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/AnsiballZ_command.py && sleep 0' 27712 1727096495.07078: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096495.07099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096495.07124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096495.07142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096495.07251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.07278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.07350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096495.23077: stdout chunk (state=3): >>> {"changed": true, "stdout": "da:ee:22:eb:6c:c1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-23 09:01:35.226527", "end": "2024-09-23 09:01:35.229729", "delta": "0:00:00.003202", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096495.24706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096495.24733: stderr chunk (state=3): >>><<< 27712 1727096495.24736: stdout chunk (state=3): >>><<< 27712 1727096495.24753: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "da:ee:22:eb:6c:c1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-23 09:01:35.226527", "end": "2024-09-23 09:01:35.229729", "delta": "0:00:00.003202", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
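The JSON blob above is the raw return value from AnsiballZ_command.py for the task the controller reports further down as "Get the interface1 MAC address". The playbook source is not part of this log, so the following is only a hedged reconstruction of what such a task would roughly look like; the register name and the changed_when override are assumptions (the override is inferred from the raw module reporting "changed": true while the final task result prints "changed": false).

# Hypothetical task sketch -- not reproduced from the actual playbook.
- name: Get the interface1 MAC address
  command: cat /sys/class/net/ethtest1/address
  register: interface1_mac      # register name is an assumption
  changed_when: false           # inferred from the final result showing "changed": false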
27712 1727096495.24785: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/ethtest1/address', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096495.24792: _low_level_execute_command(): starting 27712 1727096495.24797: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096494.9674532-28736-237613785229882/ > /dev/null 2>&1 && sleep 0' 27712 1727096495.25263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096495.25266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.25272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096495.25274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.25330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096495.25333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.25336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.25376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096495.27205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096495.27236: stderr chunk (state=3): >>><<< 27712 1727096495.27239: stdout chunk (state=3): >>><<< 27712 1727096495.27253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096495.27259: handler run complete 27712 1727096495.27281: Evaluated conditional (False): False 27712 1727096495.27290: attempt loop complete, returning result 27712 1727096495.27293: _execute() done 27712 1727096495.27295: dumping result to json 27712 1727096495.27300: done dumping result, returning 27712 1727096495.27308: done running TaskExecutor() for managed_node2/TASK: Get the interface1 MAC address [0afff68d-5257-cbc7-8716-000000000064] 27712 1727096495.27312: sending task result for task 0afff68d-5257-cbc7-8716-000000000064 27712 1727096495.27410: done sending task result for task 0afff68d-5257-cbc7-8716-000000000064 27712 1727096495.27413: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/sys/class/net/ethtest1/address" ], "delta": "0:00:00.003202", "end": "2024-09-23 09:01:35.229729", "rc": 0, "start": "2024-09-23 09:01:35.226527" } STDOUT: da:ee:22:eb:6c:c1 27712 1727096495.27499: no more pending results, returning what we have 27712 1727096495.27502: results queue empty 27712 1727096495.27503: checking for any_errors_fatal 27712 1727096495.27509: done checking for any_errors_fatal 27712 1727096495.27509: checking for max_fail_percentage 27712 1727096495.27511: done checking for max_fail_percentage 27712 1727096495.27512: checking to see if all hosts have failed and the running result is not ok 27712 1727096495.27513: done checking to see if all hosts have failed 27712 1727096495.27513: getting the remaining hosts for this loop 27712 1727096495.27515: done getting the remaining hosts for this loop 27712 1727096495.27518: getting the next task for host managed_node2 27712 1727096495.27525: done getting next task for host managed_node2 27712 1727096495.27530: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27712 1727096495.27533: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096495.27552: getting variables 27712 1727096495.27553: in VariableManager get_vars() 27712 1727096495.27592: Calling all_inventory to load vars for managed_node2 27712 1727096495.27595: Calling groups_inventory to load vars for managed_node2 27712 1727096495.27597: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096495.27606: Calling all_plugins_play to load vars for managed_node2 27712 1727096495.27608: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096495.27611: Calling groups_plugins_play to load vars for managed_node2 27712 1727096495.28478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096495.29339: done with get_vars() 27712 1727096495.29355: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:01:35 -0400 (0:00:00.377) 0:00:20.987 ****** 27712 1727096495.29427: entering _queue_task() for managed_node2/include_tasks 27712 1727096495.29669: worker is 1 (out of 1 available) 27712 1727096495.29683: exiting _queue_task() for managed_node2/include_tasks 27712 1727096495.29696: done queuing things up, now waiting for results queue to drain 27712 1727096495.29697: waiting for pending results... 27712 1727096495.29871: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27712 1727096495.29972: in run() - task 0afff68d-5257-cbc7-8716-00000000006c 27712 1727096495.29988: variable 'ansible_search_path' from source: unknown 27712 1727096495.29991: variable 'ansible_search_path' from source: unknown 27712 1727096495.30018: calling self._execute() 27712 1727096495.30094: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096495.30098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096495.30107: variable 'omit' from source: magic vars 27712 1727096495.30388: variable 'ansible_distribution_major_version' from source: facts 27712 1727096495.30398: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096495.30404: _execute() done 27712 1727096495.30406: dumping result to json 27712 1727096495.30411: done dumping result, returning 27712 1727096495.30417: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-cbc7-8716-00000000006c] 27712 1727096495.30422: sending task result for task 0afff68d-5257-cbc7-8716-00000000006c 27712 1727096495.30505: done sending task result for task 0afff68d-5257-cbc7-8716-00000000006c 27712 1727096495.30507: WORKER PROCESS EXITING 27712 1727096495.30546: no more pending results, returning what we have 27712 1727096495.30551: in VariableManager get_vars() 27712 1727096495.30596: Calling all_inventory to load vars for managed_node2 27712 1727096495.30599: Calling groups_inventory to load vars for managed_node2 27712 1727096495.30601: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096495.30612: Calling all_plugins_play to load vars for managed_node2 27712 1727096495.30614: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096495.30617: Calling groups_plugins_play to load vars for managed_node2 27712 1727096495.31399: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096495.32351: done with get_vars() 27712 1727096495.32364: variable 'ansible_search_path' from source: unknown 27712 1727096495.32365: variable 'ansible_search_path' from source: unknown 27712 1727096495.32394: we have included files to process 27712 1727096495.32395: generating all_blocks data 27712 1727096495.32397: done generating all_blocks data 27712 1727096495.32401: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096495.32401: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096495.32403: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096495.32778: done processing included file 27712 1727096495.32780: iterating over new_blocks loaded from include file 27712 1727096495.32781: in VariableManager get_vars() 27712 1727096495.32796: done with get_vars() 27712 1727096495.32797: filtering new block on tags 27712 1727096495.32809: done filtering new block on tags 27712 1727096495.32811: in VariableManager get_vars() 27712 1727096495.32824: done with get_vars() 27712 1727096495.32825: filtering new block on tags 27712 1727096495.32836: done filtering new block on tags 27712 1727096495.32838: in VariableManager get_vars() 27712 1727096495.32853: done with get_vars() 27712 1727096495.32855: filtering new block on tags 27712 1727096495.32865: done filtering new block on tags 27712 1727096495.32866: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 27712 1727096495.32873: extending task lists for all hosts with included blocks 27712 1727096495.33314: done extending task lists 27712 1727096495.33315: done processing included files 27712 1727096495.33316: results queue empty 27712 1727096495.33316: checking for any_errors_fatal 27712 1727096495.33319: done checking for any_errors_fatal 27712 1727096495.33320: checking for max_fail_percentage 27712 1727096495.33320: done checking for max_fail_percentage 27712 1727096495.33321: checking to see if all hosts have failed and the running result is not ok 27712 1727096495.33321: done checking to see if all hosts have failed 27712 1727096495.33322: getting the remaining hosts for this loop 27712 1727096495.33322: done getting the remaining hosts for this loop 27712 1727096495.33324: getting the next task for host managed_node2 27712 1727096495.33326: done getting next task for host managed_node2 27712 1727096495.33328: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27712 1727096495.33330: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096495.33336: getting variables 27712 1727096495.33337: in VariableManager get_vars() 27712 1727096495.33347: Calling all_inventory to load vars for managed_node2 27712 1727096495.33348: Calling groups_inventory to load vars for managed_node2 27712 1727096495.33350: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096495.33353: Calling all_plugins_play to load vars for managed_node2 27712 1727096495.33354: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096495.33356: Calling groups_plugins_play to load vars for managed_node2 27712 1727096495.34011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096495.34856: done with get_vars() 27712 1727096495.34874: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:01:35 -0400 (0:00:00.054) 0:00:21.042 ****** 27712 1727096495.34924: entering _queue_task() for managed_node2/setup 27712 1727096495.35170: worker is 1 (out of 1 available) 27712 1727096495.35183: exiting _queue_task() for managed_node2/setup 27712 1727096495.35195: done queuing things up, now waiting for results queue to drain 27712 1727096495.35196: waiting for pending results... 27712 1727096495.35372: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27712 1727096495.35469: in run() - task 0afff68d-5257-cbc7-8716-000000000563 27712 1727096495.35484: variable 'ansible_search_path' from source: unknown 27712 1727096495.35487: variable 'ansible_search_path' from source: unknown 27712 1727096495.35514: calling self._execute() 27712 1727096495.35590: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096495.35594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096495.35602: variable 'omit' from source: magic vars 27712 1727096495.35877: variable 'ansible_distribution_major_version' from source: facts 27712 1727096495.35887: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096495.36028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096495.37605: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096495.37657: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096495.37688: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096495.37716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096495.37737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096495.37797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
27712 1727096495.37821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096495.37839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096495.37866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096495.37881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096495.37918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096495.37937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096495.37954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096495.37983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096495.37993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096495.38101: variable '__network_required_facts' from source: role '' defaults 27712 1727096495.38107: variable 'ansible_facts' from source: unknown 27712 1727096495.38530: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 27712 1727096495.38534: when evaluation is False, skipping this task 27712 1727096495.38537: _execute() done 27712 1727096495.38539: dumping result to json 27712 1727096495.38542: done dumping result, returning 27712 1727096495.38548: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-cbc7-8716-000000000563] 27712 1727096495.38552: sending task result for task 0afff68d-5257-cbc7-8716-000000000563 27712 1727096495.38638: done sending task result for task 0afff68d-5257-cbc7-8716-000000000563 27712 1727096495.38641: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096495.38710: no more pending results, returning what we have 27712 1727096495.38714: results queue empty 27712 1727096495.38715: checking for any_errors_fatal 27712 1727096495.38716: done checking for any_errors_fatal 27712 1727096495.38717: checking for max_fail_percentage 27712 1727096495.38718: done checking for max_fail_percentage 27712 1727096495.38719: checking to see if all hosts have failed and the running 
result is not ok 27712 1727096495.38720: done checking to see if all hosts have failed 27712 1727096495.38720: getting the remaining hosts for this loop 27712 1727096495.38721: done getting the remaining hosts for this loop 27712 1727096495.38725: getting the next task for host managed_node2 27712 1727096495.38733: done getting next task for host managed_node2 27712 1727096495.38737: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 27712 1727096495.38741: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096495.38757: getting variables 27712 1727096495.38759: in VariableManager get_vars() 27712 1727096495.38800: Calling all_inventory to load vars for managed_node2 27712 1727096495.38803: Calling groups_inventory to load vars for managed_node2 27712 1727096495.38805: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096495.38814: Calling all_plugins_play to load vars for managed_node2 27712 1727096495.38816: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096495.38819: Calling groups_plugins_play to load vars for managed_node2 27712 1727096495.39699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096495.40565: done with get_vars() 27712 1727096495.40584: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:01:35 -0400 (0:00:00.057) 0:00:21.099 ****** 27712 1727096495.40656: entering _queue_task() for managed_node2/stat 27712 1727096495.40897: worker is 1 (out of 1 available) 27712 1727096495.40911: exiting _queue_task() for managed_node2/stat 27712 1727096495.40921: done queuing things up, now waiting for results queue to drain 27712 1727096495.40923: waiting for pending results... 
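The skip above comes from the first task in set_facts.yml (line 3): it runs the setup module only when __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, i.e. only when some fact the role needs is still missing, and here every required fact was already gathered. A hedged sketch of that task, assuming the role file roughly matches what this log shows (any module arguments beyond the bare setup call are not visible here):

# Hedged sketch of roles/network/tasks/set_facts.yml:3 -- not the verbatim role source.
- name: Ensure ansible_facts used by role are present
  setup:                        # module type taken from "_queue_task() for managed_node2/setup"
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                  # matches the censored skip result printed above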
27712 1727096495.41105: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 27712 1727096495.41208: in run() - task 0afff68d-5257-cbc7-8716-000000000565 27712 1727096495.41219: variable 'ansible_search_path' from source: unknown 27712 1727096495.41223: variable 'ansible_search_path' from source: unknown 27712 1727096495.41253: calling self._execute() 27712 1727096495.41330: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096495.41334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096495.41342: variable 'omit' from source: magic vars 27712 1727096495.41625: variable 'ansible_distribution_major_version' from source: facts 27712 1727096495.41634: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096495.41752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096495.41948: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096495.41984: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096495.42009: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096495.42037: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096495.42100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096495.42118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096495.42138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096495.42156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096495.42222: variable '__network_is_ostree' from source: set_fact 27712 1727096495.42228: Evaluated conditional (not __network_is_ostree is defined): False 27712 1727096495.42231: when evaluation is False, skipping this task 27712 1727096495.42236: _execute() done 27712 1727096495.42238: dumping result to json 27712 1727096495.42240: done dumping result, returning 27712 1727096495.42252: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-cbc7-8716-000000000565] 27712 1727096495.42255: sending task result for task 0afff68d-5257-cbc7-8716-000000000565 27712 1727096495.42336: done sending task result for task 0afff68d-5257-cbc7-8716-000000000565 27712 1727096495.42338: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27712 1727096495.42397: no more pending results, returning what we have 27712 1727096495.42400: results queue empty 27712 1727096495.42401: checking for any_errors_fatal 27712 1727096495.42408: done checking for any_errors_fatal 27712 1727096495.42409: checking for 
max_fail_percentage 27712 1727096495.42410: done checking for max_fail_percentage 27712 1727096495.42411: checking to see if all hosts have failed and the running result is not ok 27712 1727096495.42412: done checking to see if all hosts have failed 27712 1727096495.42413: getting the remaining hosts for this loop 27712 1727096495.42414: done getting the remaining hosts for this loop 27712 1727096495.42418: getting the next task for host managed_node2 27712 1727096495.42424: done getting next task for host managed_node2 27712 1727096495.42427: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27712 1727096495.42431: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096495.42447: getting variables 27712 1727096495.42448: in VariableManager get_vars() 27712 1727096495.42486: Calling all_inventory to load vars for managed_node2 27712 1727096495.42488: Calling groups_inventory to load vars for managed_node2 27712 1727096495.42490: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096495.42499: Calling all_plugins_play to load vars for managed_node2 27712 1727096495.42501: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096495.42503: Calling groups_plugins_play to load vars for managed_node2 27712 1727096495.43296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096495.44154: done with get_vars() 27712 1727096495.44173: done getting variables 27712 1727096495.44215: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:01:35 -0400 (0:00:00.035) 0:00:21.135 ****** 27712 1727096495.44241: entering _queue_task() for managed_node2/set_fact 27712 1727096495.44482: worker is 1 (out of 1 available) 27712 1727096495.44495: exiting _queue_task() for managed_node2/set_fact 27712 1727096495.44508: done queuing things up, now waiting for results queue to drain 27712 1727096495.44510: waiting for pending results... 
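The ostree check above is skipped because __network_is_ostree is already defined (the log shows it coming "from source: set_fact"), and the same guard applies to the set_fact task queued next. A hedged sketch of the pair at set_facts.yml:12 and :17; only the task names, module types and when condition are taken from this log, while the stat path and register name are assumptions (ostree hosts are commonly detected via /run/ostree-booted):

# Hedged sketch -- not the verbatim role source.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted            # assumed detection path, not shown in this log
  register: __ostree_booted_stat        # register name is an assumption
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined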
27712 1727096495.44687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27712 1727096495.44794: in run() - task 0afff68d-5257-cbc7-8716-000000000566 27712 1727096495.44806: variable 'ansible_search_path' from source: unknown 27712 1727096495.44810: variable 'ansible_search_path' from source: unknown 27712 1727096495.44839: calling self._execute() 27712 1727096495.44914: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096495.44918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096495.44927: variable 'omit' from source: magic vars 27712 1727096495.45196: variable 'ansible_distribution_major_version' from source: facts 27712 1727096495.45206: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096495.45320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096495.45510: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096495.45542: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096495.45569: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096495.45597: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096495.45657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096495.45678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096495.45696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096495.45719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096495.45783: variable '__network_is_ostree' from source: set_fact 27712 1727096495.45789: Evaluated conditional (not __network_is_ostree is defined): False 27712 1727096495.45792: when evaluation is False, skipping this task 27712 1727096495.45794: _execute() done 27712 1727096495.45797: dumping result to json 27712 1727096495.45801: done dumping result, returning 27712 1727096495.45808: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-cbc7-8716-000000000566] 27712 1727096495.45812: sending task result for task 0afff68d-5257-cbc7-8716-000000000566 27712 1727096495.45896: done sending task result for task 0afff68d-5257-cbc7-8716-000000000566 27712 1727096495.45898: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27712 1727096495.45976: no more pending results, returning what we have 27712 1727096495.45979: results queue empty 27712 1727096495.45980: checking for any_errors_fatal 27712 1727096495.45987: done checking for any_errors_fatal 27712 
1727096495.45988: checking for max_fail_percentage 27712 1727096495.45989: done checking for max_fail_percentage 27712 1727096495.45990: checking to see if all hosts have failed and the running result is not ok 27712 1727096495.45991: done checking to see if all hosts have failed 27712 1727096495.45992: getting the remaining hosts for this loop 27712 1727096495.45993: done getting the remaining hosts for this loop 27712 1727096495.45997: getting the next task for host managed_node2 27712 1727096495.46006: done getting next task for host managed_node2 27712 1727096495.46009: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27712 1727096495.46013: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096495.46027: getting variables 27712 1727096495.46028: in VariableManager get_vars() 27712 1727096495.46059: Calling all_inventory to load vars for managed_node2 27712 1727096495.46061: Calling groups_inventory to load vars for managed_node2 27712 1727096495.46063: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096495.46076: Calling all_plugins_play to load vars for managed_node2 27712 1727096495.46078: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096495.46081: Calling groups_plugins_play to load vars for managed_node2 27712 1727096495.46927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096495.47794: done with get_vars() 27712 1727096495.47808: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:01:35 -0400 (0:00:00.036) 0:00:21.172 ****** 27712 1727096495.47876: entering _queue_task() for managed_node2/service_facts 27712 1727096495.48098: worker is 1 (out of 1 available) 27712 1727096495.48113: exiting _queue_task() for managed_node2/service_facts 27712 1727096495.48124: done queuing things up, now waiting for results queue to drain 27712 1727096495.48125: waiting for pending results... 
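Next the role queues service_facts (set_facts.yml:21) to discover which services are present; service_facts takes no arguments and populates ansible_facts.services, which the role can then consult (e.g. to see whether NetworkManager is running). A minimal, hedged sketch of that task (only the name, module and file location are taken from this log):

# Hedged sketch of roles/network/tasks/set_facts.yml:21 -- not the verbatim role source.
- name: Check which services are running
  service_facts: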
27712 1727096495.48289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 27712 1727096495.48384: in run() - task 0afff68d-5257-cbc7-8716-000000000568 27712 1727096495.48396: variable 'ansible_search_path' from source: unknown 27712 1727096495.48399: variable 'ansible_search_path' from source: unknown 27712 1727096495.48425: calling self._execute() 27712 1727096495.48501: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096495.48505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096495.48514: variable 'omit' from source: magic vars 27712 1727096495.48780: variable 'ansible_distribution_major_version' from source: facts 27712 1727096495.48790: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096495.48800: variable 'omit' from source: magic vars 27712 1727096495.48844: variable 'omit' from source: magic vars 27712 1727096495.48869: variable 'omit' from source: magic vars 27712 1727096495.48903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096495.48933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096495.48948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096495.48962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096495.48975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096495.48995: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096495.48999: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096495.49002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096495.49073: Set connection var ansible_connection to ssh 27712 1727096495.49077: Set connection var ansible_pipelining to False 27712 1727096495.49082: Set connection var ansible_timeout to 10 27712 1727096495.49085: Set connection var ansible_shell_type to sh 27712 1727096495.49091: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096495.49096: Set connection var ansible_shell_executable to /bin/sh 27712 1727096495.49113: variable 'ansible_shell_executable' from source: unknown 27712 1727096495.49116: variable 'ansible_connection' from source: unknown 27712 1727096495.49119: variable 'ansible_module_compression' from source: unknown 27712 1727096495.49123: variable 'ansible_shell_type' from source: unknown 27712 1727096495.49126: variable 'ansible_shell_executable' from source: unknown 27712 1727096495.49128: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096495.49130: variable 'ansible_pipelining' from source: unknown 27712 1727096495.49132: variable 'ansible_timeout' from source: unknown 27712 1727096495.49134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096495.49273: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096495.49280: variable 'omit' from source: magic vars 27712 
1727096495.49285: starting attempt loop 27712 1727096495.49288: running the handler 27712 1727096495.49298: _low_level_execute_command(): starting 27712 1727096495.49305: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096495.49810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096495.49814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.49817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096495.49820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.49877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096495.49881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.49883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.49926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096495.51604: stdout chunk (state=3): >>>/root <<< 27712 1727096495.51706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096495.51735: stderr chunk (state=3): >>><<< 27712 1727096495.51738: stdout chunk (state=3): >>><<< 27712 1727096495.51759: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096495.51777: _low_level_execute_command(): starting 27712 1727096495.51785: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818 `" && echo ansible-tmp-1727096495.5175889-28762-209974826259818="` echo /root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818 `" ) && sleep 0' 27712 1727096495.52238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096495.52241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.52252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096495.52255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096495.52257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.52298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096495.52305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.52308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.52341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096495.54241: stdout chunk (state=3): >>>ansible-tmp-1727096495.5175889-28762-209974826259818=/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818 <<< 27712 1727096495.54346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096495.54377: stderr chunk (state=3): >>><<< 27712 1727096495.54381: stdout chunk (state=3): >>><<< 27712 1727096495.54398: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096495.5175889-28762-209974826259818=/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096495.54435: variable 'ansible_module_compression' from source: unknown 27712 1727096495.54470: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 27712 1727096495.54508: variable 'ansible_facts' from source: unknown 27712 1727096495.54556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/AnsiballZ_service_facts.py 27712 1727096495.54660: Sending initial data 27712 1727096495.54664: Sent initial data (162 bytes) 27712 1727096495.55262: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.55295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096495.55303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.55306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.55336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096495.56959: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096495.56996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
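At this point in the trace the controller has created a per-task temporary directory on the managed node (ansible-tmp-1727096495.5175889-28762-209974826259818) and is pushing the AnsiballZ-packaged service_facts module into it over the already-established SSH ControlMaster connection; the SFTP extension negotiation above belongs to that transfer. As a rough illustration of the directory-naming pattern visible in the log (epoch seconds, controller PID, large random suffix), the following minimal Python sketch mimics the format; it is not Ansible's actual implementation, and the random range is an assumption chosen only to produce a number of similar magnitude:

    # Sketch only: mimic the ansible-tmp-<epoch>-<pid>-<random> name seen above.
    import os
    import random
    import time

    def ansible_style_tmpdir_name() -> str:
        # 2**48 is an assumed bound; the range Ansible really uses may differ.
        return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

    print(ansible_style_tmpdir_name())  # e.g. ansible-tmp-1727096495.51-28762-123456789012345
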
<<< 27712 1727096495.57061: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpbz0qjz5m /root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/AnsiballZ_service_facts.py <<< 27712 1727096495.57064: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/AnsiballZ_service_facts.py" <<< 27712 1727096495.57278: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpbz0qjz5m" to remote "/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/AnsiballZ_service_facts.py" <<< 27712 1727096495.57869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096495.57893: stderr chunk (state=3): >>><<< 27712 1727096495.57910: stdout chunk (state=3): >>><<< 27712 1727096495.58009: done transferring module to remote 27712 1727096495.58027: _low_level_execute_command(): starting 27712 1727096495.58036: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/ /root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/AnsiballZ_service_facts.py && sleep 0' 27712 1727096495.58696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096495.58709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096495.58783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.58840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096495.58863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.58915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.58943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096495.60742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096495.60765: stderr chunk (state=3): >>><<< 27712 1727096495.60773: stdout chunk (state=3): >>><<< 27712 1727096495.60787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
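After the upload, the wrapper is made executable with a single `chmod u+x` covering both the temporary directory and AnsiballZ_service_facts.py, as shown in the command above. The same operation, expressed as a small Python sketch with hypothetical local paths (purely illustrative, not how Ansible performs it on the remote side):

    # Sketch only: add the user-execute bit to a path, as `chmod u+x` does.
    import os
    import stat

    def add_user_exec_bit(path: str) -> None:
        os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)

    # Hypothetical usage mirroring the trace:
    # add_user_exec_bit("/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818")
    # add_user_exec_bit("/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/AnsiballZ_service_facts.py")
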
10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096495.60790: _low_level_execute_command(): starting 27712 1727096495.60794: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/AnsiballZ_service_facts.py && sleep 0' 27712 1727096495.61223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096495.61227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.61229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096495.61231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096495.61288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096495.61292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096495.61330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096497.20822: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27712 1727096497.22085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096497.22273: stderr chunk (state=3): >>><<< 27712 1727096497.22277: stdout chunk (state=3): >>><<< 27712 1727096497.22285: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": 
{"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096497.24209: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096497.24387: _low_level_execute_command(): starting 27712 1727096497.24399: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096495.5175889-28762-209974826259818/ > /dev/null 2>&1 && sleep 0' 27712 1727096497.25984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096497.26373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096497.26484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096497.26513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096497.28432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096497.28435: stdout chunk (state=3): >>><<< 27712 1727096497.28438: stderr chunk (state=3): >>><<< 27712 1727096497.28453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096497.28463: handler run complete 27712 1727096497.28720: variable 'ansible_facts' from source: unknown 27712 1727096497.29375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096497.30164: variable 'ansible_facts' from source: unknown 27712 1727096497.30495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096497.30938: attempt loop complete, returning result 27712 1727096497.30949: _execute() done 27712 1727096497.30957: dumping result to json 27712 1727096497.31105: done dumping result, returning 27712 1727096497.31123: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-cbc7-8716-000000000568] 27712 1727096497.31133: sending task result for task 0afff68d-5257-cbc7-8716-000000000568 27712 1727096497.33282: done sending task result for task 0afff68d-5257-cbc7-8716-000000000568 27712 1727096497.33285: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096497.33357: no more pending results, returning what we have 27712 1727096497.33359: results queue empty 27712 1727096497.33360: checking for any_errors_fatal 27712 1727096497.33364: done checking for any_errors_fatal 27712 1727096497.33364: checking for max_fail_percentage 27712 1727096497.33366: done checking for max_fail_percentage 27712 1727096497.33366: checking to see if all hosts have failed and the running result is not ok 27712 1727096497.33369: done checking to see if all hosts have failed 27712 1727096497.33370: getting the remaining hosts for this loop 27712 1727096497.33373: done getting the remaining hosts for this loop 27712 1727096497.33376: getting the next task for host managed_node2 27712 1727096497.33381: done getting next task for host managed_node2 27712 1727096497.33384: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27712 1727096497.33388: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096497.33397: getting variables 27712 1727096497.33398: in VariableManager get_vars() 27712 1727096497.33427: Calling all_inventory to load vars for managed_node2 27712 1727096497.33429: Calling groups_inventory to load vars for managed_node2 27712 1727096497.33431: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096497.33439: Calling all_plugins_play to load vars for managed_node2 27712 1727096497.33442: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096497.33444: Calling groups_plugins_play to load vars for managed_node2 27712 1727096497.35596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096497.37251: done with get_vars() 27712 1727096497.37279: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:01:37 -0400 (0:00:01.894) 0:00:23.067 ****** 27712 1727096497.37377: entering _queue_task() for managed_node2/package_facts 27712 1727096497.37719: worker is 1 (out of 1 available) 27712 1727096497.37732: exiting _queue_task() for managed_node2/package_facts 27712 1727096497.37744: done queuing things up, now waiting for results queue to drain 27712 1727096497.37746: waiting for pending results... 27712 1727096497.38096: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 27712 1727096497.38271: in run() - task 0afff68d-5257-cbc7-8716-000000000569 27712 1727096497.38297: variable 'ansible_search_path' from source: unknown 27712 1727096497.38305: variable 'ansible_search_path' from source: unknown 27712 1727096497.38343: calling self._execute() 27712 1727096497.38443: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096497.38455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096497.38511: variable 'omit' from source: magic vars 27712 1727096497.39077: variable 'ansible_distribution_major_version' from source: facts 27712 1727096497.39195: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096497.39281: variable 'omit' from source: magic vars 27712 1727096497.39363: variable 'omit' from source: magic vars 27712 1727096497.39436: variable 'omit' from source: magic vars 27712 1727096497.39548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096497.39652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096497.39678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096497.39700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096497.39739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096497.39840: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096497.39850: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096497.39858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096497.40006: Set connection var ansible_connection to ssh 27712 
1727096497.40020: Set connection var ansible_pipelining to False 27712 1727096497.40029: Set connection var ansible_timeout to 10 27712 1727096497.40035: Set connection var ansible_shell_type to sh 27712 1727096497.40050: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096497.40060: Set connection var ansible_shell_executable to /bin/sh 27712 1727096497.40088: variable 'ansible_shell_executable' from source: unknown 27712 1727096497.40096: variable 'ansible_connection' from source: unknown 27712 1727096497.40102: variable 'ansible_module_compression' from source: unknown 27712 1727096497.40109: variable 'ansible_shell_type' from source: unknown 27712 1727096497.40124: variable 'ansible_shell_executable' from source: unknown 27712 1727096497.40132: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096497.40140: variable 'ansible_pipelining' from source: unknown 27712 1727096497.40146: variable 'ansible_timeout' from source: unknown 27712 1727096497.40158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096497.40351: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096497.40369: variable 'omit' from source: magic vars 27712 1727096497.40382: starting attempt loop 27712 1727096497.40388: running the handler 27712 1727096497.40405: _low_level_execute_command(): starting 27712 1727096497.40417: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096497.41136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096497.41154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096497.41199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096497.41234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096497.41372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096497.43065: stdout chunk (state=3): >>>/root <<< 27712 1727096497.43374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096497.43379: stdout chunk (state=3): >>><<< 27712 1727096497.43382: stderr chunk (state=3): >>><<< 27712 1727096497.43384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096497.43387: _low_level_execute_command(): starting 27712 1727096497.43390: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137 `" && echo ansible-tmp-1727096497.4330919-28824-6411035561137="` echo /root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137 `" ) && sleep 0' 27712 1727096497.44011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096497.44033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096497.44083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096497.44150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096497.44165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096497.44198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096497.44253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096497.46193: stdout chunk (state=3): >>>ansible-tmp-1727096497.4330919-28824-6411035561137=/root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137 <<< 27712 1727096497.46379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096497.46440: stderr chunk (state=3): >>><<< 27712 1727096497.46444: stdout chunk (state=3): >>><<< 27712 1727096497.46481: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096497.4330919-28824-6411035561137=/root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096497.46530: variable 'ansible_module_compression' from source: unknown 27712 1727096497.46658: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 27712 1727096497.46662: variable 'ansible_facts' from source: unknown 27712 1727096497.46879: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/AnsiballZ_package_facts.py 27712 1727096497.47026: Sending initial data 27712 1727096497.47103: Sent initial data (160 bytes) 27712 1727096497.48076: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096497.48096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096497.48116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096497.48219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096497.49825: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27712 1727096497.49849: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 27712 1727096497.49888: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096497.49910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096497.49976: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpma_6d9p1 /root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/AnsiballZ_package_facts.py <<< 27712 1727096497.49980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/AnsiballZ_package_facts.py" <<< 27712 1727096497.50050: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpma_6d9p1" to remote "/root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/AnsiballZ_package_facts.py" <<< 27712 1727096497.51692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096497.51696: stdout chunk (state=3): >>><<< 27712 1727096497.51698: stderr chunk (state=3): >>><<< 27712 1727096497.51749: done transferring module to remote 27712 1727096497.51773: _low_level_execute_command(): starting 27712 1727096497.51796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/ /root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/AnsiballZ_package_facts.py && sleep 0' 27712 1727096497.52459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096497.52482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096497.52578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096497.52640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096497.52654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096497.54557: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096497.54561: stdout chunk (state=3): >>><<< 27712 1727096497.54587: stderr chunk (state=3): >>><<< 27712 1727096497.54686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096497.54689: _low_level_execute_command(): starting 27712 1727096497.54692: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/AnsiballZ_package_facts.py && sleep 0' 27712 1727096497.55328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096497.55343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096497.55364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096497.55452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096498.00348: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", 
"version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 27712 1727096498.00422: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": 
"3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": 
"libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 27712 1727096498.00540: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": 
"pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": 
"libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": 
[{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 27712 1727096498.00574: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", 
"version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 27712 1727096498.00627: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27712 1727096498.02317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096498.02350: stderr chunk (state=3): >>><<< 27712 1727096498.02354: stdout chunk (state=3): >>><<< 27712 1727096498.02410: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096498.03708: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096498.03725: _low_level_execute_command(): starting 27712 1727096498.03728: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096497.4330919-28824-6411035561137/ > /dev/null 2>&1 && sleep 0' 27712 1727096498.04385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096498.04444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096498.04461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096498.04485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096498.04555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096498.06395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096498.06419: stderr chunk (state=3): >>><<< 27712 1727096498.06423: stdout chunk (state=3): >>><<< 27712 1727096498.06436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096498.06439: handler run complete 27712 1727096498.07172: variable 'ansible_facts' from source: unknown 27712 1727096498.07506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.08894: variable 'ansible_facts' from source: unknown 27712 1727096498.09133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.09521: attempt loop complete, returning result 27712 1727096498.09537: _execute() done 27712 1727096498.09540: dumping result to json 27712 1727096498.09773: done dumping result, returning 27712 1727096498.09776: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-cbc7-8716-000000000569] 27712 1727096498.09778: sending task result for task 0afff68d-5257-cbc7-8716-000000000569 27712 1727096498.11687: done sending task result for task 0afff68d-5257-cbc7-8716-000000000569 27712 1727096498.11690: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096498.11780: no more pending results, returning what we have 27712 1727096498.11782: results queue empty 27712 1727096498.11783: checking for any_errors_fatal 27712 1727096498.11787: done checking for any_errors_fatal 27712 1727096498.11788: checking for max_fail_percentage 27712 1727096498.11788: done checking for max_fail_percentage 27712 1727096498.11789: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.11790: done checking to see if all hosts have failed 27712 1727096498.11790: getting the remaining hosts for this loop 27712 1727096498.11791: done getting the remaining hosts for this loop 27712 1727096498.11793: getting the next task for host managed_node2 27712 1727096498.11798: done getting next task for host managed_node2 27712 1727096498.11801: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 27712 1727096498.11803: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.11810: getting variables 27712 1727096498.11810: in VariableManager get_vars() 27712 1727096498.11835: Calling all_inventory to load vars for managed_node2 27712 1727096498.11837: Calling groups_inventory to load vars for managed_node2 27712 1727096498.11838: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.11844: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.11847: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.11849: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.12561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.13841: done with get_vars() 27712 1727096498.13864: done getting variables 27712 1727096498.13944: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:01:38 -0400 (0:00:00.766) 0:00:23.833 ****** 27712 1727096498.13978: entering _queue_task() for managed_node2/debug 27712 1727096498.14220: worker is 1 (out of 1 available) 27712 1727096498.14234: exiting _queue_task() for managed_node2/debug 27712 1727096498.14246: done queuing things up, now waiting for results queue to drain 27712 1727096498.14248: waiting for pending results... 27712 1727096498.14421: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 27712 1727096498.14514: in run() - task 0afff68d-5257-cbc7-8716-00000000006d 27712 1727096498.14531: variable 'ansible_search_path' from source: unknown 27712 1727096498.14534: variable 'ansible_search_path' from source: unknown 27712 1727096498.14563: calling self._execute() 27712 1727096498.14641: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.14644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.14654: variable 'omit' from source: magic vars 27712 1727096498.14934: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.14944: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.14949: variable 'omit' from source: magic vars 27712 1727096498.14990: variable 'omit' from source: magic vars 27712 1727096498.15061: variable 'network_provider' from source: set_fact 27712 1727096498.15077: variable 'omit' from source: magic vars 27712 1727096498.15108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096498.15134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096498.15152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096498.15166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096498.15179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 
1727096498.15201: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096498.15204: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.15207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.15280: Set connection var ansible_connection to ssh 27712 1727096498.15286: Set connection var ansible_pipelining to False 27712 1727096498.15292: Set connection var ansible_timeout to 10 27712 1727096498.15294: Set connection var ansible_shell_type to sh 27712 1727096498.15301: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096498.15305: Set connection var ansible_shell_executable to /bin/sh 27712 1727096498.15322: variable 'ansible_shell_executable' from source: unknown 27712 1727096498.15325: variable 'ansible_connection' from source: unknown 27712 1727096498.15327: variable 'ansible_module_compression' from source: unknown 27712 1727096498.15330: variable 'ansible_shell_type' from source: unknown 27712 1727096498.15332: variable 'ansible_shell_executable' from source: unknown 27712 1727096498.15334: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.15338: variable 'ansible_pipelining' from source: unknown 27712 1727096498.15340: variable 'ansible_timeout' from source: unknown 27712 1727096498.15344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.15442: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096498.15454: variable 'omit' from source: magic vars 27712 1727096498.15457: starting attempt loop 27712 1727096498.15459: running the handler 27712 1727096498.15501: handler run complete 27712 1727096498.15511: attempt loop complete, returning result 27712 1727096498.15514: _execute() done 27712 1727096498.15516: dumping result to json 27712 1727096498.15518: done dumping result, returning 27712 1727096498.15526: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-cbc7-8716-00000000006d] 27712 1727096498.15529: sending task result for task 0afff68d-5257-cbc7-8716-00000000006d 27712 1727096498.15607: done sending task result for task 0afff68d-5257-cbc7-8716-00000000006d 27712 1727096498.15610: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 27712 1727096498.15673: no more pending results, returning what we have 27712 1727096498.15676: results queue empty 27712 1727096498.15677: checking for any_errors_fatal 27712 1727096498.15688: done checking for any_errors_fatal 27712 1727096498.15689: checking for max_fail_percentage 27712 1727096498.15690: done checking for max_fail_percentage 27712 1727096498.15691: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.15692: done checking to see if all hosts have failed 27712 1727096498.15692: getting the remaining hosts for this loop 27712 1727096498.15694: done getting the remaining hosts for this loop 27712 1727096498.15697: getting the next task for host managed_node2 27712 1727096498.15703: done getting next task for host managed_node2 27712 1727096498.15706: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 27712 1727096498.15709: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096498.15723: getting variables 27712 1727096498.15724: in VariableManager get_vars() 27712 1727096498.15759: Calling all_inventory to load vars for managed_node2 27712 1727096498.15762: Calling groups_inventory to load vars for managed_node2 27712 1727096498.15764: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.15780: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.15783: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.15785: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.16558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.17491: done with get_vars() 27712 1727096498.17509: done getting variables 27712 1727096498.17548: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:01:38 -0400 (0:00:00.035) 0:00:23.869 ****** 27712 1727096498.17573: entering _queue_task() for managed_node2/fail 27712 1727096498.17792: worker is 1 (out of 1 available) 27712 1727096498.17804: exiting _queue_task() for managed_node2/fail 27712 1727096498.17815: done queuing things up, now waiting for results queue to drain 27712 1727096498.17817: waiting for pending results... 
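The abort task announced in the banner above can be pictured as a guarded fail task. The sketch below is an illustrative reconstruction, not the actual fedora.linux_system_roles.network source: the first when condition is taken verbatim from the false_condition printed in the skip result that follows, while the second condition and the message wording are assumptions based on the task name.

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider.  # wording assumed
  when:
    - network_state != {}
    - network_provider == "initscripts"  # assumed from the task name; only the first condition is evaluated in the trace below

Because network_state comes from the role defaults as an empty dict on this run, the first condition evaluates to False and the task is skipped with skip_reason "Conditional result was False".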
27712 1727096498.17992: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27712 1727096498.18081: in run() - task 0afff68d-5257-cbc7-8716-00000000006e 27712 1727096498.18093: variable 'ansible_search_path' from source: unknown 27712 1727096498.18097: variable 'ansible_search_path' from source: unknown 27712 1727096498.18124: calling self._execute() 27712 1727096498.18204: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.18209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.18217: variable 'omit' from source: magic vars 27712 1727096498.18492: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.18502: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.18587: variable 'network_state' from source: role '' defaults 27712 1727096498.18598: Evaluated conditional (network_state != {}): False 27712 1727096498.18602: when evaluation is False, skipping this task 27712 1727096498.18604: _execute() done 27712 1727096498.18607: dumping result to json 27712 1727096498.18609: done dumping result, returning 27712 1727096498.18613: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-cbc7-8716-00000000006e] 27712 1727096498.18619: sending task result for task 0afff68d-5257-cbc7-8716-00000000006e 27712 1727096498.18702: done sending task result for task 0afff68d-5257-cbc7-8716-00000000006e 27712 1727096498.18705: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096498.18750: no more pending results, returning what we have 27712 1727096498.18753: results queue empty 27712 1727096498.18754: checking for any_errors_fatal 27712 1727096498.18763: done checking for any_errors_fatal 27712 1727096498.18763: checking for max_fail_percentage 27712 1727096498.18765: done checking for max_fail_percentage 27712 1727096498.18766: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.18766: done checking to see if all hosts have failed 27712 1727096498.18769: getting the remaining hosts for this loop 27712 1727096498.18771: done getting the remaining hosts for this loop 27712 1727096498.18774: getting the next task for host managed_node2 27712 1727096498.18779: done getting next task for host managed_node2 27712 1727096498.18783: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27712 1727096498.18785: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.18801: getting variables 27712 1727096498.18802: in VariableManager get_vars() 27712 1727096498.18833: Calling all_inventory to load vars for managed_node2 27712 1727096498.18836: Calling groups_inventory to load vars for managed_node2 27712 1727096498.18838: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.18846: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.18848: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.18850: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.19587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.20429: done with get_vars() 27712 1727096498.20445: done getting variables 27712 1727096498.20488: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:01:38 -0400 (0:00:00.029) 0:00:23.898 ****** 27712 1727096498.20514: entering _queue_task() for managed_node2/fail 27712 1727096498.20733: worker is 1 (out of 1 available) 27712 1727096498.20747: exiting _queue_task() for managed_node2/fail 27712 1727096498.20758: done queuing things up, now waiting for results queue to drain 27712 1727096498.20759: waiting for pending results... 
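The version guard announced above follows the same pattern; again a sketch rather than the role's literal source. Only network_state != {} is reported as the failing condition in the trace below, so the version comparison is inferred from the task name and should be read as an assumption.

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires a managed host running version 8 or later.  # wording assumed
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8  # inferred from the task name

The conditions in a when list are combined with AND and evaluated in order, which is why the skip result only names the first condition that came out False.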
27712 1727096498.20939: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27712 1727096498.21029: in run() - task 0afff68d-5257-cbc7-8716-00000000006f 27712 1727096498.21042: variable 'ansible_search_path' from source: unknown 27712 1727096498.21046: variable 'ansible_search_path' from source: unknown 27712 1727096498.21077: calling self._execute() 27712 1727096498.21151: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.21154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.21162: variable 'omit' from source: magic vars 27712 1727096498.21438: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.21448: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.21534: variable 'network_state' from source: role '' defaults 27712 1727096498.21542: Evaluated conditional (network_state != {}): False 27712 1727096498.21544: when evaluation is False, skipping this task 27712 1727096498.21547: _execute() done 27712 1727096498.21550: dumping result to json 27712 1727096498.21552: done dumping result, returning 27712 1727096498.21559: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-cbc7-8716-00000000006f] 27712 1727096498.21562: sending task result for task 0afff68d-5257-cbc7-8716-00000000006f 27712 1727096498.21647: done sending task result for task 0afff68d-5257-cbc7-8716-00000000006f 27712 1727096498.21650: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096498.21704: no more pending results, returning what we have 27712 1727096498.21708: results queue empty 27712 1727096498.21709: checking for any_errors_fatal 27712 1727096498.21717: done checking for any_errors_fatal 27712 1727096498.21717: checking for max_fail_percentage 27712 1727096498.21719: done checking for max_fail_percentage 27712 1727096498.21719: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.21720: done checking to see if all hosts have failed 27712 1727096498.21721: getting the remaining hosts for this loop 27712 1727096498.21722: done getting the remaining hosts for this loop 27712 1727096498.21725: getting the next task for host managed_node2 27712 1727096498.21732: done getting next task for host managed_node2 27712 1727096498.21735: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27712 1727096498.21738: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.21753: getting variables 27712 1727096498.21754: in VariableManager get_vars() 27712 1727096498.21797: Calling all_inventory to load vars for managed_node2 27712 1727096498.21800: Calling groups_inventory to load vars for managed_node2 27712 1727096498.21802: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.21811: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.21813: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.21816: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.22677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.23523: done with get_vars() 27712 1727096498.23539: done getting variables 27712 1727096498.23583: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:01:38 -0400 (0:00:00.030) 0:00:23.929 ****** 27712 1727096498.23605: entering _queue_task() for managed_node2/fail 27712 1727096498.23828: worker is 1 (out of 1 available) 27712 1727096498.23842: exiting _queue_task() for managed_node2/fail 27712 1727096498.23855: done queuing things up, now waiting for results queue to drain 27712 1727096498.23856: waiting for pending results... 
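The teaming guard announced above is the most involved of these checks, and its full conditional is printed verbatim as the false_condition in the skip result further down, so its shape can be sketched with some confidence. It is still a reconstruction rather than the role's literal source; the message wording is assumed, while all three when expressions appear in the trace below (the first two evaluate True on this host, the third False).

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on this distribution version.  # wording assumed
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0

The selectattr("type", "defined") step filters out connection entries that have no type key, so the following match test never touches an undefined attribute.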
27712 1727096498.24029: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27712 1727096498.24124: in run() - task 0afff68d-5257-cbc7-8716-000000000070 27712 1727096498.24135: variable 'ansible_search_path' from source: unknown 27712 1727096498.24138: variable 'ansible_search_path' from source: unknown 27712 1727096498.24165: calling self._execute() 27712 1727096498.24242: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.24247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.24256: variable 'omit' from source: magic vars 27712 1727096498.24525: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.24531: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.24651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096498.26127: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096498.26175: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096498.26202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096498.26226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096498.26246: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096498.26305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.26325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.26342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.26374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.26383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.26446: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.26458: Evaluated conditional (ansible_distribution_major_version | int > 9): True 27712 1727096498.26536: variable 'ansible_distribution' from source: facts 27712 1727096498.26539: variable '__network_rh_distros' from source: role '' defaults 27712 1727096498.26547: Evaluated conditional (ansible_distribution in __network_rh_distros): True 27712 1727096498.26708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.26725: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.26742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.26766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.26780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.26817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.26831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.26847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.26874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.26884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.26913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.26931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.26947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.26973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.26983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.27181: variable 'network_connections' from source: task vars 27712 1727096498.27190: variable 'interface1' from source: play vars 27712 1727096498.27240: variable 'interface1' from source: play vars 27712 1727096498.27294: variable 'interface1_mac' from source: set_fact 27712 1727096498.27308: variable 'network_state' from source: role '' defaults 27712 1727096498.27353: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096498.27460: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096498.27492: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096498.27515: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096498.27536: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096498.27566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096498.27591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096498.27608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.27625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096498.27652: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 27712 1727096498.27656: when evaluation is False, skipping this task 27712 1727096498.27658: _execute() done 27712 1727096498.27660: dumping result to json 27712 1727096498.27663: done dumping result, returning 27712 1727096498.27676: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-cbc7-8716-000000000070] 27712 1727096498.27679: sending task result for task 0afff68d-5257-cbc7-8716-000000000070 27712 1727096498.27758: done sending task result for task 0afff68d-5257-cbc7-8716-000000000070 27712 1727096498.27760: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 27712 1727096498.27823: no more pending results, returning what we have 27712 1727096498.27826: results queue empty 27712 1727096498.27827: checking for any_errors_fatal 27712 1727096498.27835: done checking for any_errors_fatal 27712 1727096498.27836: checking for max_fail_percentage 27712 1727096498.27837: done checking for max_fail_percentage 27712 1727096498.27838: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.27839: done checking to see if all hosts have failed 27712 1727096498.27839: getting the remaining hosts for this loop 27712 1727096498.27841: done getting the remaining hosts for this loop 27712 1727096498.27844: getting the next task for host 
managed_node2 27712 1727096498.27851: done getting next task for host managed_node2 27712 1727096498.27855: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27712 1727096498.27858: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096498.27878: getting variables 27712 1727096498.27880: in VariableManager get_vars() 27712 1727096498.27916: Calling all_inventory to load vars for managed_node2 27712 1727096498.27920: Calling groups_inventory to load vars for managed_node2 27712 1727096498.27922: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.27931: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.27933: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.27936: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.28728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.29632: done with get_vars() 27712 1727096498.29655: done getting variables 27712 1727096498.29716: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:01:38 -0400 (0:00:00.061) 0:00:23.990 ****** 27712 1727096498.29746: entering _queue_task() for managed_node2/dnf 27712 1727096498.30242: worker is 1 (out of 1 available) 27712 1727096498.30256: exiting _queue_task() for managed_node2/dnf 27712 1727096498.30474: done queuing things up, now waiting for results queue to drain 27712 1727096498.30486: waiting for pending results... 
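The DNF check announced above is gated on two booleans that the trace loads from the role defaults, __network_wireless_connections_defined and __network_team_connections_defined; both come out False here, so the task is skipped. The sketch below shows the general shape of such a check and is not the role's literal source: the package list variable, the register name, and the use of check_mode with state latest are all assumptions.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # package list variable assumed
    state: latest
  check_mode: true
  register: __network_package_updates  # register name assumed
  when: __network_wireless_connections_defined or __network_team_connections_defined

A plausible shape for the two gating booleans is a selectattr filter over network_connections that looks for entries of type wireless or team, which would also explain why the trace re-reads network_connections and the interface1 play vars while evaluating them.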
27712 1727096498.30799: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27712 1727096498.30805: in run() - task 0afff68d-5257-cbc7-8716-000000000071 27712 1727096498.30819: variable 'ansible_search_path' from source: unknown 27712 1727096498.30828: variable 'ansible_search_path' from source: unknown 27712 1727096498.30873: calling self._execute() 27712 1727096498.30987: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.31006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.31022: variable 'omit' from source: magic vars 27712 1727096498.31418: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.31436: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.31637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096498.35513: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096498.35649: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096498.35698: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096498.35744: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096498.35784: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096498.35873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.35913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.35959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.36328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.36332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.36389: variable 'ansible_distribution' from source: facts 27712 1727096498.36403: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.36448: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27712 1727096498.36579: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096498.36729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.36756: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.36793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.36837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.36856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.36907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.36936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.36964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.37016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.37034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.37080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.37113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.37140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.37187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.37209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.37366: variable 'network_connections' from source: task vars 27712 1727096498.37388: variable 'interface1' from source: play vars 27712 1727096498.37463: variable 'interface1' from source: play vars 27712 1727096498.37551: variable 'interface1_mac' from source: set_fact 27712 1727096498.37637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096498.43909: 
Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096498.44047: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096498.44050: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096498.44052: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096498.44085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096498.44111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096498.44150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.44187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096498.44241: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096498.44507: variable 'network_connections' from source: task vars 27712 1727096498.44517: variable 'interface1' from source: play vars 27712 1727096498.44590: variable 'interface1' from source: play vars 27712 1727096498.44674: variable 'interface1_mac' from source: set_fact 27712 1727096498.44720: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096498.44775: when evaluation is False, skipping this task 27712 1727096498.44778: _execute() done 27712 1727096498.44780: dumping result to json 27712 1727096498.44782: done dumping result, returning 27712 1727096498.44785: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000071] 27712 1727096498.44787: sending task result for task 0afff68d-5257-cbc7-8716-000000000071 27712 1727096498.45021: done sending task result for task 0afff68d-5257-cbc7-8716-000000000071 27712 1727096498.45024: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096498.45073: no more pending results, returning what we have 27712 1727096498.45076: results queue empty 27712 1727096498.45077: checking for any_errors_fatal 27712 1727096498.45083: done checking for any_errors_fatal 27712 1727096498.45084: checking for max_fail_percentage 27712 1727096498.45086: done checking for max_fail_percentage 27712 1727096498.45086: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.45087: done checking to see if all hosts have failed 27712 1727096498.45088: getting the remaining hosts for this loop 27712 1727096498.45089: done getting the remaining hosts for this loop 27712 1727096498.45093: getting the next task for host managed_node2 27712 1727096498.45099: done getting next task for host 
managed_node2 27712 1727096498.45103: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27712 1727096498.45106: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096498.45124: getting variables 27712 1727096498.45125: in VariableManager get_vars() 27712 1727096498.45163: Calling all_inventory to load vars for managed_node2 27712 1727096498.45165: Calling groups_inventory to load vars for managed_node2 27712 1727096498.45169: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.45181: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.45183: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.45185: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.54546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.55656: done with get_vars() 27712 1727096498.55685: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27712 1727096498.55750: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:01:38 -0400 (0:00:00.260) 0:00:24.251 ****** 27712 1727096498.55779: entering _queue_task() for managed_node2/yum 27712 1727096498.56203: worker is 1 (out of 1 available) 27712 1727096498.56216: exiting _queue_task() for managed_node2/yum 27712 1727096498.56228: done queuing things up, now waiting for results queue to drain 27712 1727096498.56230: waiting for pending results... 
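The companion YUM check announced above is the legacy path for hosts below major version 8; on this host the version gate fails (the trace evaluated ansible_distribution_major_version | int > 9 as True earlier), so it is skipped. Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line: even an explicit yum task is handled by the dnf action plugin on this system. A sketch of such a gated task, with the same caveats as before (the package variable and the second gate are assumptions; only the version condition appears as false_condition below):

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # package list variable assumed
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined  # assumed, mirroring the DNF variant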
27712 1727096498.56554: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27712 1727096498.56705: in run() - task 0afff68d-5257-cbc7-8716-000000000072 27712 1727096498.56784: variable 'ansible_search_path' from source: unknown 27712 1727096498.56788: variable 'ansible_search_path' from source: unknown 27712 1727096498.56791: calling self._execute() 27712 1727096498.56888: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.56904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.56920: variable 'omit' from source: magic vars 27712 1727096498.57331: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.57335: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.57507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096498.59487: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096498.59576: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096498.59676: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096498.59679: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096498.59682: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096498.59800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.59813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.59843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.59906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.59975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.60061: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.60086: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27712 1727096498.60093: when evaluation is False, skipping this task 27712 1727096498.60100: _execute() done 27712 1727096498.60106: dumping result to json 27712 1727096498.60113: done dumping result, returning 27712 1727096498.60133: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000072] 27712 
1727096498.60142: sending task result for task 0afff68d-5257-cbc7-8716-000000000072 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 27712 1727096498.60443: no more pending results, returning what we have 27712 1727096498.60446: results queue empty 27712 1727096498.60447: checking for any_errors_fatal 27712 1727096498.60461: done checking for any_errors_fatal 27712 1727096498.60461: checking for max_fail_percentage 27712 1727096498.60463: done checking for max_fail_percentage 27712 1727096498.60464: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.60464: done checking to see if all hosts have failed 27712 1727096498.60465: getting the remaining hosts for this loop 27712 1727096498.60466: done getting the remaining hosts for this loop 27712 1727096498.60473: getting the next task for host managed_node2 27712 1727096498.60480: done getting next task for host managed_node2 27712 1727096498.60483: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27712 1727096498.60486: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.60504: getting variables 27712 1727096498.60506: in VariableManager get_vars() 27712 1727096498.60555: Calling all_inventory to load vars for managed_node2 27712 1727096498.60563: Calling groups_inventory to load vars for managed_node2 27712 1727096498.60566: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.60576: done sending task result for task 0afff68d-5257-cbc7-8716-000000000072 27712 1727096498.60578: WORKER PROCESS EXITING 27712 1727096498.60586: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.60589: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.60591: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.61861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.63146: done with get_vars() 27712 1727096498.63163: done getting variables 27712 1727096498.63221: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:01:38 -0400 (0:00:00.074) 0:00:24.325 ****** 27712 1727096498.63248: entering _queue_task() for managed_node2/fail 27712 1727096498.63563: worker is 1 (out of 1 available) 27712 1727096498.63737: exiting _queue_task() for managed_node2/fail 27712 1727096498.63748: done queuing things up, now waiting for results queue to drain 27712 1727096498.63749: waiting for pending results... 
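The consent task announced above is implemented with the fail action (the trace loads ActionModule 'fail' right before the banner): when wireless or team connections are requested, the role presumably stops and asks the user to explicitly allow a NetworkManager restart. On this run both gating booleans are False, so it is skipped. The sketch below is illustrative only; the message wording and the consent variable name are assumptions, and only the first when expression is confirmed by the false_condition in the result that follows.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # message wording assumed
    msg: Managing wireless or team interfaces requires restarting NetworkManager; set the role's consent variable to proceed.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - not (network_allow_restart | default(false))  # variable name assumed / hypothetical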
27712 1727096498.64188: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27712 1727096498.64423: in run() - task 0afff68d-5257-cbc7-8716-000000000073 27712 1727096498.64428: variable 'ansible_search_path' from source: unknown 27712 1727096498.64431: variable 'ansible_search_path' from source: unknown 27712 1727096498.64449: calling self._execute() 27712 1727096498.64684: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.64709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.64725: variable 'omit' from source: magic vars 27712 1727096498.65062: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.65077: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.65157: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096498.65296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096498.67176: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096498.67179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096498.67220: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096498.67258: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096498.67293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096498.67380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.67417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.67450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.67506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.67524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.67579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.67645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.67649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.67666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.67687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.67739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.67752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.67771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.67797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.67808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.67921: variable 'network_connections' from source: task vars 27712 1727096498.67931: variable 'interface1' from source: play vars 27712 1727096498.67988: variable 'interface1' from source: play vars 27712 1727096498.68047: variable 'interface1_mac' from source: set_fact 27712 1727096498.68107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096498.68215: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096498.68242: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096498.68274: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096498.68300: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096498.68330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096498.68345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096498.68362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.68386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096498.68432: variable '__network_team_connections_defined' from source: 
role '' defaults 27712 1727096498.68582: variable 'network_connections' from source: task vars 27712 1727096498.68585: variable 'interface1' from source: play vars 27712 1727096498.68630: variable 'interface1' from source: play vars 27712 1727096498.68682: variable 'interface1_mac' from source: set_fact 27712 1727096498.68709: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096498.68712: when evaluation is False, skipping this task 27712 1727096498.68717: _execute() done 27712 1727096498.68719: dumping result to json 27712 1727096498.68721: done dumping result, returning 27712 1727096498.68731: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000073] 27712 1727096498.68741: sending task result for task 0afff68d-5257-cbc7-8716-000000000073 27712 1727096498.68822: done sending task result for task 0afff68d-5257-cbc7-8716-000000000073 27712 1727096498.68825: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096498.68881: no more pending results, returning what we have 27712 1727096498.68884: results queue empty 27712 1727096498.68885: checking for any_errors_fatal 27712 1727096498.68891: done checking for any_errors_fatal 27712 1727096498.68891: checking for max_fail_percentage 27712 1727096498.68893: done checking for max_fail_percentage 27712 1727096498.68894: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.68894: done checking to see if all hosts have failed 27712 1727096498.68895: getting the remaining hosts for this loop 27712 1727096498.68897: done getting the remaining hosts for this loop 27712 1727096498.68900: getting the next task for host managed_node2 27712 1727096498.68906: done getting next task for host managed_node2 27712 1727096498.68910: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27712 1727096498.68913: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.68929: getting variables 27712 1727096498.68931: in VariableManager get_vars() 27712 1727096498.68970: Calling all_inventory to load vars for managed_node2 27712 1727096498.68973: Calling groups_inventory to load vars for managed_node2 27712 1727096498.68976: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.68986: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.68989: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.68991: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.69888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.70748: done with get_vars() 27712 1727096498.70763: done getting variables 27712 1727096498.70807: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:01:38 -0400 (0:00:00.075) 0:00:24.401 ****** 27712 1727096498.70832: entering _queue_task() for managed_node2/package 27712 1727096498.71060: worker is 1 (out of 1 available) 27712 1727096498.71075: exiting _queue_task() for managed_node2/package 27712 1727096498.71087: done queuing things up, now waiting for results queue to drain 27712 1727096498.71089: waiting for pending results... 27712 1727096498.71263: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 27712 1727096498.71355: in run() - task 0afff68d-5257-cbc7-8716-000000000074 27712 1727096498.71368: variable 'ansible_search_path' from source: unknown 27712 1727096498.71372: variable 'ansible_search_path' from source: unknown 27712 1727096498.71402: calling self._execute() 27712 1727096498.71482: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.71486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.71495: variable 'omit' from source: magic vars 27712 1727096498.71774: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.71786: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.71914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096498.72108: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096498.72142: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096498.72166: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096498.72233: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096498.72311: variable 'network_packages' from source: role '' defaults 27712 1727096498.72383: variable '__network_provider_setup' from source: role '' defaults 27712 1727096498.72391: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096498.72442: variable 
'__network_service_name_default_nm' from source: role '' defaults 27712 1727096498.72451: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096498.72498: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096498.72612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096498.73919: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096498.73963: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096498.73993: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096498.74018: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096498.74038: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096498.74107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.74127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.74144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.74178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.74189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.74219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.74235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.74252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.74285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.74295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.74432: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27712 1727096498.74506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.74523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.74539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.74563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.74577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.74639: variable 'ansible_python' from source: facts 27712 1727096498.74659: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27712 1727096498.74719: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096498.74773: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096498.74855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.74873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.74893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.74924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.74930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.74962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.74985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.75002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.75032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.75038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.75134: variable 'network_connections' from source: task vars 27712 1727096498.75138: variable 'interface1' from source: play vars 27712 1727096498.75211: variable 'interface1' from source: play vars 27712 1727096498.75294: variable 'interface1_mac' from source: set_fact 27712 1727096498.75353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096498.75379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096498.75399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.75420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096498.75455: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096498.75634: variable 'network_connections' from source: task vars 27712 1727096498.75638: variable 'interface1' from source: play vars 27712 1727096498.75711: variable 'interface1' from source: play vars 27712 1727096498.75790: variable 'interface1_mac' from source: set_fact 27712 1727096498.75837: variable '__network_packages_default_wireless' from source: role '' defaults 27712 1727096498.75893: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096498.76089: variable 'network_connections' from source: task vars 27712 1727096498.76092: variable 'interface1' from source: play vars 27712 1727096498.76139: variable 'interface1' from source: play vars 27712 1727096498.76197: variable 'interface1_mac' from source: set_fact 27712 1727096498.76219: variable '__network_packages_default_team' from source: role '' defaults 27712 1727096498.76278: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096498.76466: variable 'network_connections' from source: task vars 27712 1727096498.76471: variable 'interface1' from source: play vars 27712 1727096498.76517: variable 'interface1' from source: play vars 27712 1727096498.76574: variable 'interface1_mac' from source: set_fact 27712 1727096498.76619: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096498.76661: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096498.76666: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096498.76715: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096498.76849: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27712 1727096498.77157: variable 'network_connections' from source: task vars 27712 1727096498.77160: variable 'interface1' from source: play vars 27712 1727096498.77207: variable 'interface1' from source: play vars 27712 1727096498.77257: variable 'interface1_mac' from source: set_fact 27712 1727096498.77269: variable 'ansible_distribution' from source: facts 27712 
1727096498.77272: variable '__network_rh_distros' from source: role '' defaults 27712 1727096498.77280: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.77296: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27712 1727096498.77403: variable 'ansible_distribution' from source: facts 27712 1727096498.77407: variable '__network_rh_distros' from source: role '' defaults 27712 1727096498.77410: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.77421: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27712 1727096498.77530: variable 'ansible_distribution' from source: facts 27712 1727096498.77533: variable '__network_rh_distros' from source: role '' defaults 27712 1727096498.77536: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.77565: variable 'network_provider' from source: set_fact 27712 1727096498.77581: variable 'ansible_facts' from source: unknown 27712 1727096498.78107: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27712 1727096498.78111: when evaluation is False, skipping this task 27712 1727096498.78113: _execute() done 27712 1727096498.78116: dumping result to json 27712 1727096498.78118: done dumping result, returning 27712 1727096498.78125: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-cbc7-8716-000000000074] 27712 1727096498.78130: sending task result for task 0afff68d-5257-cbc7-8716-000000000074 27712 1727096498.78215: done sending task result for task 0afff68d-5257-cbc7-8716-000000000074 27712 1727096498.78218: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27712 1727096498.78270: no more pending results, returning what we have 27712 1727096498.78274: results queue empty 27712 1727096498.78275: checking for any_errors_fatal 27712 1727096498.78284: done checking for any_errors_fatal 27712 1727096498.78285: checking for max_fail_percentage 27712 1727096498.78286: done checking for max_fail_percentage 27712 1727096498.78287: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.78288: done checking to see if all hosts have failed 27712 1727096498.78289: getting the remaining hosts for this loop 27712 1727096498.78290: done getting the remaining hosts for this loop 27712 1727096498.78293: getting the next task for host managed_node2 27712 1727096498.78299: done getting next task for host managed_node2 27712 1727096498.78303: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27712 1727096498.78305: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.78322: getting variables 27712 1727096498.78323: in VariableManager get_vars() 27712 1727096498.78363: Calling all_inventory to load vars for managed_node2 27712 1727096498.78366: Calling groups_inventory to load vars for managed_node2 27712 1727096498.78370: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.78384: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.78387: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.78389: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.79190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.80160: done with get_vars() 27712 1727096498.80178: done getting variables 27712 1727096498.80223: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:01:38 -0400 (0:00:00.094) 0:00:24.495 ****** 27712 1727096498.80247: entering _queue_task() for managed_node2/package 27712 1727096498.80502: worker is 1 (out of 1 available) 27712 1727096498.80517: exiting _queue_task() for managed_node2/package 27712 1727096498.80530: done queuing things up, now waiting for results queue to drain 27712 1727096498.80531: waiting for pending results... 
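The 'Install packages' task skipped just above uses the generic 'package' action, and its skip result records the exact gate: not network_packages is subset(ansible_facts.packages.keys()). A sketch built only from the variable and test shown in the log (the real task body may carry extra options):

- name: Install packages
  package:  # sketch; assumes a plain present-state install of network_packages
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

Because every entry in network_packages was already reported in ansible_facts.packages, the conditional was False and no install ran.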
27712 1727096498.80713: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27712 1727096498.80819: in run() - task 0afff68d-5257-cbc7-8716-000000000075 27712 1727096498.80833: variable 'ansible_search_path' from source: unknown 27712 1727096498.80836: variable 'ansible_search_path' from source: unknown 27712 1727096498.80876: calling self._execute() 27712 1727096498.80952: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.80956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.80965: variable 'omit' from source: magic vars 27712 1727096498.81256: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.81265: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.81348: variable 'network_state' from source: role '' defaults 27712 1727096498.81356: Evaluated conditional (network_state != {}): False 27712 1727096498.81359: when evaluation is False, skipping this task 27712 1727096498.81361: _execute() done 27712 1727096498.81364: dumping result to json 27712 1727096498.81366: done dumping result, returning 27712 1727096498.81377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-cbc7-8716-000000000075] 27712 1727096498.81380: sending task result for task 0afff68d-5257-cbc7-8716-000000000075 27712 1727096498.81475: done sending task result for task 0afff68d-5257-cbc7-8716-000000000075 27712 1727096498.81478: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096498.81524: no more pending results, returning what we have 27712 1727096498.81528: results queue empty 27712 1727096498.81529: checking for any_errors_fatal 27712 1727096498.81534: done checking for any_errors_fatal 27712 1727096498.81535: checking for max_fail_percentage 27712 1727096498.81537: done checking for max_fail_percentage 27712 1727096498.81537: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.81538: done checking to see if all hosts have failed 27712 1727096498.81539: getting the remaining hosts for this loop 27712 1727096498.81540: done getting the remaining hosts for this loop 27712 1727096498.81543: getting the next task for host managed_node2 27712 1727096498.81549: done getting next task for host managed_node2 27712 1727096498.81552: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27712 1727096498.81555: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.81576: getting variables 27712 1727096498.81578: in VariableManager get_vars() 27712 1727096498.81612: Calling all_inventory to load vars for managed_node2 27712 1727096498.81614: Calling groups_inventory to load vars for managed_node2 27712 1727096498.81616: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.81624: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.81626: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.81628: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.82824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.83766: done with get_vars() 27712 1727096498.83787: done getting variables 27712 1727096498.83827: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:01:38 -0400 (0:00:00.036) 0:00:24.531 ****** 27712 1727096498.83853: entering _queue_task() for managed_node2/package 27712 1727096498.84084: worker is 1 (out of 1 available) 27712 1727096498.84097: exiting _queue_task() for managed_node2/package 27712 1727096498.84110: done queuing things up, now waiting for results queue to drain 27712 1727096498.84111: waiting for pending results... 
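This task and the 'Install NetworkManager and nmstate' task skipped just before it share the same pattern: a 'package' action gated on network_state being non-empty. A sketch under that assumption, with the package names inferred from the task titles rather than taken from the role source:

- name: Install NetworkManager and nmstate when using network_state variable
  package:  # sketch; package list inferred from the task name
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  package:  # sketch; package name inferred from the task name
    name: python3-libnmstate
    state: present
  when: network_state != {}

network_state is not set in this run, so network_state != {} is False and both tasks are skipped.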
27712 1727096498.84286: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27712 1727096498.84369: in run() - task 0afff68d-5257-cbc7-8716-000000000076 27712 1727096498.84382: variable 'ansible_search_path' from source: unknown 27712 1727096498.84386: variable 'ansible_search_path' from source: unknown 27712 1727096498.84413: calling self._execute() 27712 1727096498.84495: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.84500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.84509: variable 'omit' from source: magic vars 27712 1727096498.84875: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.84879: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.84962: variable 'network_state' from source: role '' defaults 27712 1727096498.84982: Evaluated conditional (network_state != {}): False 27712 1727096498.84990: when evaluation is False, skipping this task 27712 1727096498.84997: _execute() done 27712 1727096498.85003: dumping result to json 27712 1727096498.85010: done dumping result, returning 27712 1727096498.85021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-cbc7-8716-000000000076] 27712 1727096498.85031: sending task result for task 0afff68d-5257-cbc7-8716-000000000076 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096498.85309: no more pending results, returning what we have 27712 1727096498.85312: results queue empty 27712 1727096498.85313: checking for any_errors_fatal 27712 1727096498.85319: done checking for any_errors_fatal 27712 1727096498.85320: checking for max_fail_percentage 27712 1727096498.85321: done checking for max_fail_percentage 27712 1727096498.85322: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.85323: done checking to see if all hosts have failed 27712 1727096498.85323: getting the remaining hosts for this loop 27712 1727096498.85325: done getting the remaining hosts for this loop 27712 1727096498.85327: getting the next task for host managed_node2 27712 1727096498.85333: done getting next task for host managed_node2 27712 1727096498.85336: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27712 1727096498.85338: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.85353: getting variables 27712 1727096498.85354: in VariableManager get_vars() 27712 1727096498.85393: Calling all_inventory to load vars for managed_node2 27712 1727096498.85395: Calling groups_inventory to load vars for managed_node2 27712 1727096498.85397: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.85403: done sending task result for task 0afff68d-5257-cbc7-8716-000000000076 27712 1727096498.85405: WORKER PROCESS EXITING 27712 1727096498.85412: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.85415: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.85417: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.86990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.88586: done with get_vars() 27712 1727096498.88607: done getting variables 27712 1727096498.88661: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:01:38 -0400 (0:00:00.048) 0:00:24.580 ****** 27712 1727096498.88698: entering _queue_task() for managed_node2/service 27712 1727096498.88996: worker is 1 (out of 1 available) 27712 1727096498.89011: exiting _queue_task() for managed_node2/service 27712 1727096498.89024: done queuing things up, now waiting for results queue to drain 27712 1727096498.89026: waiting for pending results... 
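For the restart task the 'service' action plugin is loaded, and its skip result below shows the same wireless/team gate as the earlier consent task. A minimal sketch of a restart with that gate, assuming a direct restart of the NetworkManager unit (the real role may route this through handlers or extra safeguards):

- name: Restart NetworkManager due to wireless or team interfaces
  service:  # sketch; assumes a plain service restart
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

As with the consent task, the gate is False in this run and the restart is skipped.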
27712 1727096498.89338: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27712 1727096498.89497: in run() - task 0afff68d-5257-cbc7-8716-000000000077 27712 1727096498.89518: variable 'ansible_search_path' from source: unknown 27712 1727096498.89527: variable 'ansible_search_path' from source: unknown 27712 1727096498.89574: calling self._execute() 27712 1727096498.89684: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.89696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.89715: variable 'omit' from source: magic vars 27712 1727096498.90112: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.90128: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.90259: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096498.90457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096498.92657: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096498.92739: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096498.92783: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096498.92824: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096498.92852: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096498.92939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.92980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.93012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.93058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.93081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.93135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.93164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.93275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27712 1727096498.93278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.93281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.93302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096498.93329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096498.93356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.93407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096498.93425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096498.93603: variable 'network_connections' from source: task vars 27712 1727096498.93624: variable 'interface1' from source: play vars 27712 1727096498.93722: variable 'interface1' from source: play vars 27712 1727096498.93799: variable 'interface1_mac' from source: set_fact 27712 1727096498.93938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096498.94079: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096498.94118: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096498.94156: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096498.94192: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096498.94237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096498.94273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096498.94302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096498.94330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096498.94398: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096498.94698: variable 'network_connections' from 
source: task vars 27712 1727096498.94701: variable 'interface1' from source: play vars 27712 1727096498.94717: variable 'interface1' from source: play vars 27712 1727096498.94794: variable 'interface1_mac' from source: set_fact 27712 1727096498.94838: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096498.94846: when evaluation is False, skipping this task 27712 1727096498.94852: _execute() done 27712 1727096498.94858: dumping result to json 27712 1727096498.94864: done dumping result, returning 27712 1727096498.94880: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000077] 27712 1727096498.94899: sending task result for task 0afff68d-5257-cbc7-8716-000000000077 27712 1727096498.95138: done sending task result for task 0afff68d-5257-cbc7-8716-000000000077 27712 1727096498.95141: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096498.95194: no more pending results, returning what we have 27712 1727096498.95197: results queue empty 27712 1727096498.95198: checking for any_errors_fatal 27712 1727096498.95205: done checking for any_errors_fatal 27712 1727096498.95206: checking for max_fail_percentage 27712 1727096498.95208: done checking for max_fail_percentage 27712 1727096498.95209: checking to see if all hosts have failed and the running result is not ok 27712 1727096498.95210: done checking to see if all hosts have failed 27712 1727096498.95211: getting the remaining hosts for this loop 27712 1727096498.95212: done getting the remaining hosts for this loop 27712 1727096498.95216: getting the next task for host managed_node2 27712 1727096498.95222: done getting next task for host managed_node2 27712 1727096498.95226: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27712 1727096498.95229: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096498.95248: getting variables 27712 1727096498.95249: in VariableManager get_vars() 27712 1727096498.95295: Calling all_inventory to load vars for managed_node2 27712 1727096498.95298: Calling groups_inventory to load vars for managed_node2 27712 1727096498.95301: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096498.95311: Calling all_plugins_play to load vars for managed_node2 27712 1727096498.95314: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096498.95317: Calling groups_plugins_play to load vars for managed_node2 27712 1727096498.96851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096498.98401: done with get_vars() 27712 1727096498.98424: done getting variables 27712 1727096498.98488: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:01:38 -0400 (0:00:00.098) 0:00:24.678 ****** 27712 1727096498.98522: entering _queue_task() for managed_node2/service 27712 1727096498.98842: worker is 1 (out of 1 available) 27712 1727096498.98856: exiting _queue_task() for managed_node2/service 27712 1727096498.98875: done queuing things up, now waiting for results queue to drain 27712 1727096498.98876: waiting for pending results... 27712 1727096498.99066: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27712 1727096498.99158: in run() - task 0afff68d-5257-cbc7-8716-000000000078 27712 1727096498.99172: variable 'ansible_search_path' from source: unknown 27712 1727096498.99177: variable 'ansible_search_path' from source: unknown 27712 1727096498.99209: calling self._execute() 27712 1727096498.99288: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096498.99293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096498.99301: variable 'omit' from source: magic vars 27712 1727096498.99568: variable 'ansible_distribution_major_version' from source: facts 27712 1727096498.99580: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096498.99690: variable 'network_provider' from source: set_fact 27712 1727096498.99695: variable 'network_state' from source: role '' defaults 27712 1727096498.99705: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27712 1727096498.99708: variable 'omit' from source: magic vars 27712 1727096498.99748: variable 'omit' from source: magic vars 27712 1727096498.99773: variable 'network_service_name' from source: role '' defaults 27712 1727096498.99823: variable 'network_service_name' from source: role '' defaults 27712 1727096498.99895: variable '__network_provider_setup' from source: role '' defaults 27712 1727096498.99899: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096498.99947: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096498.99955: variable '__network_packages_default_nm' from source: role '' defaults 
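Unlike the preceding tasks, 'Enable and start NetworkManager' passes its gate, with (network_provider == "nm" or network_state != {}) evaluating to True, so the run continues by resolving the remaining role defaults and opening an SSH connection to execute the service module. A sketch of a service task using only the variables named in the log (the state/enabled values are assumptions):

- name: Enable and start NetworkManager
  service:  # sketch; the exact parameters in main.yml:122 are not shown in the log
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

The lines that follow show the ssh connection and sh shell plugins being selected, the connection variables being set, and _low_level_execute_command() probing the remote home directory with 'echo ~ && sleep 0'.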
27712 1727096499.00002: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096499.00144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096499.02227: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096499.02272: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096499.02311: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096499.02335: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096499.02355: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096499.02416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096499.02437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096499.02454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.02484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.02495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.02529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096499.02546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096499.02562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.02591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.02601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.02736: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27712 1727096499.02817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096499.02833: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096499.02854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.02883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.02893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.02953: variable 'ansible_python' from source: facts 27712 1727096499.02974: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27712 1727096499.03027: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096499.03084: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096499.03164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096499.03187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096499.03203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.03229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.03239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.03274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096499.03323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096499.03326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.03334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.03349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.03477: variable 'network_connections' from 
source: task vars 27712 1727096499.03481: variable 'interface1' from source: play vars 27712 1727096499.03672: variable 'interface1' from source: play vars 27712 1727096499.03675: variable 'interface1_mac' from source: set_fact 27712 1727096499.03747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096499.03933: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096499.03988: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096499.04034: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096499.04084: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096499.04146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096499.04184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096499.04221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.04260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096499.04314: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096499.04580: variable 'network_connections' from source: task vars 27712 1727096499.04591: variable 'interface1' from source: play vars 27712 1727096499.04664: variable 'interface1' from source: play vars 27712 1727096499.04756: variable 'interface1_mac' from source: set_fact 27712 1727096499.04821: variable '__network_packages_default_wireless' from source: role '' defaults 27712 1727096499.04902: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096499.05177: variable 'network_connections' from source: task vars 27712 1727096499.05189: variable 'interface1' from source: play vars 27712 1727096499.05257: variable 'interface1' from source: play vars 27712 1727096499.05343: variable 'interface1_mac' from source: set_fact 27712 1727096499.05378: variable '__network_packages_default_team' from source: role '' defaults 27712 1727096499.05456: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096499.05740: variable 'network_connections' from source: task vars 27712 1727096499.05875: variable 'interface1' from source: play vars 27712 1727096499.05879: variable 'interface1' from source: play vars 27712 1727096499.05911: variable 'interface1_mac' from source: set_fact 27712 1727096499.05978: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096499.06039: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096499.06051: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096499.06115: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096499.06329: variable 
'__network_packages_default_initscripts_bridge' from source: role '' defaults 27712 1727096499.06797: variable 'network_connections' from source: task vars 27712 1727096499.06807: variable 'interface1' from source: play vars 27712 1727096499.06869: variable 'interface1' from source: play vars 27712 1727096499.06942: variable 'interface1_mac' from source: set_fact 27712 1727096499.06961: variable 'ansible_distribution' from source: facts 27712 1727096499.06972: variable '__network_rh_distros' from source: role '' defaults 27712 1727096499.06982: variable 'ansible_distribution_major_version' from source: facts 27712 1727096499.07007: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27712 1727096499.07184: variable 'ansible_distribution' from source: facts 27712 1727096499.07193: variable '__network_rh_distros' from source: role '' defaults 27712 1727096499.07202: variable 'ansible_distribution_major_version' from source: facts 27712 1727096499.07219: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27712 1727096499.07404: variable 'ansible_distribution' from source: facts 27712 1727096499.07413: variable '__network_rh_distros' from source: role '' defaults 27712 1727096499.07422: variable 'ansible_distribution_major_version' from source: facts 27712 1727096499.07462: variable 'network_provider' from source: set_fact 27712 1727096499.07672: variable 'omit' from source: magic vars 27712 1727096499.07675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096499.07678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096499.07680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096499.07682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096499.07684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096499.07686: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096499.07688: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096499.07690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096499.07741: Set connection var ansible_connection to ssh 27712 1727096499.07754: Set connection var ansible_pipelining to False 27712 1727096499.07763: Set connection var ansible_timeout to 10 27712 1727096499.07772: Set connection var ansible_shell_type to sh 27712 1727096499.07784: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096499.07792: Set connection var ansible_shell_executable to /bin/sh 27712 1727096499.07817: variable 'ansible_shell_executable' from source: unknown 27712 1727096499.07824: variable 'ansible_connection' from source: unknown 27712 1727096499.07830: variable 'ansible_module_compression' from source: unknown 27712 1727096499.07836: variable 'ansible_shell_type' from source: unknown 27712 1727096499.07841: variable 'ansible_shell_executable' from source: unknown 27712 1727096499.07848: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096499.07855: variable 'ansible_pipelining' from source: unknown 27712 1727096499.07861: variable 'ansible_timeout' from source: unknown 27712 
1727096499.07872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096499.07970: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096499.07989: variable 'omit' from source: magic vars 27712 1727096499.07999: starting attempt loop 27712 1727096499.08006: running the handler 27712 1727096499.08082: variable 'ansible_facts' from source: unknown 27712 1727096499.08732: _low_level_execute_command(): starting 27712 1727096499.08744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096499.09402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096499.09417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096499.09433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.09453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096499.09472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096499.09564: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096499.09595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096499.09610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.09681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.11354: stdout chunk (state=3): >>>/root <<< 27712 1727096499.11488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.11499: stdout chunk (state=3): >>><<< 27712 1727096499.11518: stderr chunk (state=3): >>><<< 27712 1727096499.11539: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096499.11556: _low_level_execute_command(): starting 27712 1727096499.11569: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224 `" && echo ansible-tmp-1727096499.1154501-28896-240636992110224="` echo /root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224 `" ) && sleep 0' 27712 1727096499.12137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096499.12153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096499.12173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.12192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096499.12210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096499.12225: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096499.12329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096499.12344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.12412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.14334: stdout chunk (state=3): >>>ansible-tmp-1727096499.1154501-28896-240636992110224=/root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224 <<< 27712 1727096499.14488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.14499: stdout chunk (state=3): >>><<< 27712 1727096499.14513: stderr chunk (state=3): >>><<< 27712 1727096499.14539: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096499.1154501-28896-240636992110224=/root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096499.14582: variable 'ansible_module_compression' from source: unknown 27712 1727096499.14641: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 27712 1727096499.14723: variable 'ansible_facts' from source: unknown 27712 1727096499.15147: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/AnsiballZ_systemd.py 27712 1727096499.15390: Sending initial data 27712 1727096499.15399: Sent initial data (156 bytes) 27712 1727096499.16061: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096499.16083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096499.16136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.16206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096499.16223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096499.16258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.16379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.17956: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096499.18022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096499.18078: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp4a7c8s0q /root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/AnsiballZ_systemd.py <<< 27712 1727096499.18082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/AnsiballZ_systemd.py" <<< 27712 1727096499.18122: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp4a7c8s0q" to remote "/root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/AnsiballZ_systemd.py" <<< 27712 1727096499.20909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.21078: stdout chunk (state=3): >>><<< 27712 1727096499.21082: stderr chunk (state=3): >>><<< 27712 1727096499.21084: done transferring module to remote 27712 1727096499.21086: _low_level_execute_command(): starting 27712 1727096499.21089: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/ /root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/AnsiballZ_systemd.py && sleep 0' 27712 1727096499.21834: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.21877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096499.21892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.22113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.23923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.23954: stderr chunk (state=3): >>><<< 27712 1727096499.23957: stdout chunk (state=3): >>><<< 27712 1727096499.23973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096499.24030: _low_level_execute_command(): starting 27712 1727096499.24034: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/AnsiballZ_systemd.py && sleep 0' 27712 1727096499.25065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096499.25084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096499.25122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.25141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096499.25314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.25360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096499.25463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.25522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.55075: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", 
"Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4661248", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304087552", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1698952000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", 
"LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", 
"RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "syst<<< 27712 1727096499.55101: stdout chunk (state=3): >>>emd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27712 1727096499.56986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.57005: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096499.57063: stderr chunk (state=3): >>><<< 27712 1727096499.57082: stdout chunk (state=3): >>><<< 27712 1727096499.57118: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4661248", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304087552", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1698952000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096499.57357: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096499.57392: _low_level_execute_command(): starting 27712 1727096499.57404: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096499.1154501-28896-240636992110224/ > /dev/null 2>&1 && sleep 0' 27712 1727096499.58049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096499.58063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096499.58082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.58098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096499.58112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096499.58133: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.58185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.58248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096499.58273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.58330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.60232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.60235: stdout chunk (state=3): >>><<< 27712 1727096499.60237: stderr chunk (state=3): >>><<< 27712 1727096499.60250: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096499.60317: handler run complete 27712 1727096499.60339: attempt loop complete, returning result 27712 1727096499.60346: _execute() done 27712 1727096499.60352: dumping result to json 27712 1727096499.60389: done dumping result, returning 27712 1727096499.60402: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-cbc7-8716-000000000078] 27712 1727096499.60410: sending task result for task 0afff68d-5257-cbc7-8716-000000000078 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096499.60936: no more pending results, returning what we have 27712 1727096499.60940: results queue empty 27712 1727096499.60941: checking for any_errors_fatal 27712 1727096499.60949: done checking for any_errors_fatal 27712 1727096499.60950: checking for max_fail_percentage 27712 1727096499.60952: done checking for max_fail_percentage 27712 1727096499.60953: checking to see if all hosts have failed and the running result is not ok 27712 1727096499.60954: done checking to see if all hosts have failed 27712 1727096499.60954: getting the remaining hosts for this loop 27712 1727096499.60956: done getting the remaining hosts for this loop 27712 1727096499.60959: getting the next task for host managed_node2 27712 1727096499.60965: done getting next task for host managed_node2 27712 1727096499.60973: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27712 1727096499.60976: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096499.60989: getting variables 27712 1727096499.60991: in VariableManager get_vars() 27712 1727096499.61028: Calling all_inventory to load vars for managed_node2 27712 1727096499.61032: Calling groups_inventory to load vars for managed_node2 27712 1727096499.61035: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096499.61047: Calling all_plugins_play to load vars for managed_node2 27712 1727096499.61051: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096499.61055: Calling groups_plugins_play to load vars for managed_node2 27712 1727096499.61581: done sending task result for task 0afff68d-5257-cbc7-8716-000000000078 27712 1727096499.61585: WORKER PROCESS EXITING 27712 1727096499.62840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096499.64465: done with get_vars() 27712 1727096499.64495: done getting variables 27712 1727096499.64549: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:01:39 -0400 (0:00:00.660) 0:00:25.339 ****** 27712 1727096499.64584: entering _queue_task() for managed_node2/service 27712 1727096499.65080: worker is 1 (out of 1 available) 27712 1727096499.65091: exiting _queue_task() for managed_node2/service 27712 1727096499.65101: done queuing things up, now waiting for results queue to drain 27712 1727096499.65102: waiting for pending results... 
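For reference, the censored "Enable and start NetworkManager" result above was produced by a service task whose arguments are visible in the module_args of the ansible.legacy.systemd invocation logged earlier (name: NetworkManager, state: started, enabled: true, with no_log in effect, so the full status dictionary is hidden from the task output). A minimal stand-alone sketch of an equivalent task follows; the actual task inside fedora.linux_system_roles.network templates the service name and the no_log setting, so treat this only as an illustration of what the role is doing at this point:

    - name: Enable and start NetworkManager
      ansible.builtin.service:   # the 'service' action delegated to the systemd module on this host
        name: NetworkManager
        state: started
        enabled: true
      no_log: true               # matches the "censored" result shown in the log
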
27712 1727096499.65184: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27712 1727096499.65332: in run() - task 0afff68d-5257-cbc7-8716-000000000079 27712 1727096499.65354: variable 'ansible_search_path' from source: unknown 27712 1727096499.65362: variable 'ansible_search_path' from source: unknown 27712 1727096499.65405: calling self._execute() 27712 1727096499.65512: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096499.65523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096499.65535: variable 'omit' from source: magic vars 27712 1727096499.65926: variable 'ansible_distribution_major_version' from source: facts 27712 1727096499.65941: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096499.66066: variable 'network_provider' from source: set_fact 27712 1727096499.66094: Evaluated conditional (network_provider == "nm"): True 27712 1727096499.66186: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096499.66284: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096499.66475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096499.68604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096499.68607: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096499.68637: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096499.68676: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096499.68703: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096499.68794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096499.68823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096499.68849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.68894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.68909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.68956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096499.68985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 27712 1727096499.69060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.69063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.69066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.69107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096499.69129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096499.69158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.69195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.69210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.69369: variable 'network_connections' from source: task vars 27712 1727096499.69375: variable 'interface1' from source: play vars 27712 1727096499.69503: variable 'interface1' from source: play vars 27712 1727096499.69506: variable 'interface1_mac' from source: set_fact 27712 1727096499.69586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096499.69749: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096499.69787: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096499.69818: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096499.69851: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096499.69893: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096499.69913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096499.69936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.69965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096499.70036: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096499.70255: variable 'network_connections' from source: task vars 27712 1727096499.70261: variable 'interface1' from source: play vars 27712 1727096499.70396: variable 'interface1' from source: play vars 27712 1727096499.70400: variable 'interface1_mac' from source: set_fact 27712 1727096499.70439: Evaluated conditional (__network_wpa_supplicant_required): False 27712 1727096499.70443: when evaluation is False, skipping this task 27712 1727096499.70445: _execute() done 27712 1727096499.70448: dumping result to json 27712 1727096499.70450: done dumping result, returning 27712 1727096499.70459: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-cbc7-8716-000000000079] 27712 1727096499.70462: sending task result for task 0afff68d-5257-cbc7-8716-000000000079 27712 1727096499.70583: done sending task result for task 0afff68d-5257-cbc7-8716-000000000079 27712 1727096499.70586: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27712 1727096499.70668: no more pending results, returning what we have 27712 1727096499.70709: results queue empty 27712 1727096499.70710: checking for any_errors_fatal 27712 1727096499.70726: done checking for any_errors_fatal 27712 1727096499.70727: checking for max_fail_percentage 27712 1727096499.70729: done checking for max_fail_percentage 27712 1727096499.70729: checking to see if all hosts have failed and the running result is not ok 27712 1727096499.70730: done checking to see if all hosts have failed 27712 1727096499.70731: getting the remaining hosts for this loop 27712 1727096499.70732: done getting the remaining hosts for this loop 27712 1727096499.70735: getting the next task for host managed_node2 27712 1727096499.70740: done getting next task for host managed_node2 27712 1727096499.70743: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27712 1727096499.70745: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096499.70760: getting variables 27712 1727096499.70761: in VariableManager get_vars() 27712 1727096499.70800: Calling all_inventory to load vars for managed_node2 27712 1727096499.70802: Calling groups_inventory to load vars for managed_node2 27712 1727096499.70804: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096499.70822: Calling all_plugins_play to load vars for managed_node2 27712 1727096499.70826: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096499.70830: Calling groups_plugins_play to load vars for managed_node2 27712 1727096499.71582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096499.72692: done with get_vars() 27712 1727096499.72714: done getting variables 27712 1727096499.72783: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:01:39 -0400 (0:00:00.082) 0:00:25.421 ****** 27712 1727096499.72814: entering _queue_task() for managed_node2/service 27712 1727096499.73137: worker is 1 (out of 1 available) 27712 1727096499.73148: exiting _queue_task() for managed_node2/service 27712 1727096499.73159: done queuing things up, now waiting for results queue to drain 27712 1727096499.73160: waiting for pending results... 27712 1727096499.73449: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 27712 1727096499.73542: in run() - task 0afff68d-5257-cbc7-8716-00000000007a 27712 1727096499.73551: variable 'ansible_search_path' from source: unknown 27712 1727096499.73555: variable 'ansible_search_path' from source: unknown 27712 1727096499.73589: calling self._execute() 27712 1727096499.73661: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096499.73666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096499.73680: variable 'omit' from source: magic vars 27712 1727096499.73954: variable 'ansible_distribution_major_version' from source: facts 27712 1727096499.73963: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096499.74046: variable 'network_provider' from source: set_fact 27712 1727096499.74050: Evaluated conditional (network_provider == "initscripts"): False 27712 1727096499.74053: when evaluation is False, skipping this task 27712 1727096499.74055: _execute() done 27712 1727096499.74058: dumping result to json 27712 1727096499.74061: done dumping result, returning 27712 1727096499.74070: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-cbc7-8716-00000000007a] 27712 1727096499.74078: sending task result for task 0afff68d-5257-cbc7-8716-00000000007a 27712 1727096499.74158: done sending task result for task 0afff68d-5257-cbc7-8716-00000000007a 27712 1727096499.74160: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 
1727096499.74207: no more pending results, returning what we have 27712 1727096499.74210: results queue empty 27712 1727096499.74211: checking for any_errors_fatal 27712 1727096499.74222: done checking for any_errors_fatal 27712 1727096499.74222: checking for max_fail_percentage 27712 1727096499.74224: done checking for max_fail_percentage 27712 1727096499.74224: checking to see if all hosts have failed and the running result is not ok 27712 1727096499.74225: done checking to see if all hosts have failed 27712 1727096499.74226: getting the remaining hosts for this loop 27712 1727096499.74227: done getting the remaining hosts for this loop 27712 1727096499.74230: getting the next task for host managed_node2 27712 1727096499.74236: done getting next task for host managed_node2 27712 1727096499.74239: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27712 1727096499.74242: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096499.74261: getting variables 27712 1727096499.74262: in VariableManager get_vars() 27712 1727096499.74296: Calling all_inventory to load vars for managed_node2 27712 1727096499.74299: Calling groups_inventory to load vars for managed_node2 27712 1727096499.74301: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096499.74309: Calling all_plugins_play to load vars for managed_node2 27712 1727096499.74311: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096499.74314: Calling groups_plugins_play to load vars for managed_node2 27712 1727096499.75042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096499.76438: done with get_vars() 27712 1727096499.76458: done getting variables 27712 1727096499.76516: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:01:39 -0400 (0:00:00.037) 0:00:25.458 ****** 27712 1727096499.76549: entering _queue_task() for managed_node2/copy 27712 1727096499.76838: worker is 1 (out of 1 available) 27712 1727096499.76852: exiting _queue_task() for managed_node2/copy 27712 1727096499.76865: done queuing things up, now waiting for results queue to drain 27712 1727096499.76867: waiting for pending results... 
27712 1727096499.77054: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27712 1727096499.77142: in run() - task 0afff68d-5257-cbc7-8716-00000000007b 27712 1727096499.77154: variable 'ansible_search_path' from source: unknown 27712 1727096499.77157: variable 'ansible_search_path' from source: unknown 27712 1727096499.77188: calling self._execute() 27712 1727096499.77260: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096499.77264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096499.77276: variable 'omit' from source: magic vars 27712 1727096499.77548: variable 'ansible_distribution_major_version' from source: facts 27712 1727096499.77558: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096499.77638: variable 'network_provider' from source: set_fact 27712 1727096499.77642: Evaluated conditional (network_provider == "initscripts"): False 27712 1727096499.77645: when evaluation is False, skipping this task 27712 1727096499.77648: _execute() done 27712 1727096499.77650: dumping result to json 27712 1727096499.77653: done dumping result, returning 27712 1727096499.77661: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-cbc7-8716-00000000007b] 27712 1727096499.77664: sending task result for task 0afff68d-5257-cbc7-8716-00000000007b 27712 1727096499.77752: done sending task result for task 0afff68d-5257-cbc7-8716-00000000007b 27712 1727096499.77755: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27712 1727096499.77805: no more pending results, returning what we have 27712 1727096499.77808: results queue empty 27712 1727096499.77809: checking for any_errors_fatal 27712 1727096499.77813: done checking for any_errors_fatal 27712 1727096499.77814: checking for max_fail_percentage 27712 1727096499.77815: done checking for max_fail_percentage 27712 1727096499.77816: checking to see if all hosts have failed and the running result is not ok 27712 1727096499.77817: done checking to see if all hosts have failed 27712 1727096499.77817: getting the remaining hosts for this loop 27712 1727096499.77818: done getting the remaining hosts for this loop 27712 1727096499.77821: getting the next task for host managed_node2 27712 1727096499.77826: done getting next task for host managed_node2 27712 1727096499.77829: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27712 1727096499.77832: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096499.77848: getting variables 27712 1727096499.77849: in VariableManager get_vars() 27712 1727096499.77890: Calling all_inventory to load vars for managed_node2 27712 1727096499.77893: Calling groups_inventory to load vars for managed_node2 27712 1727096499.77895: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096499.77903: Calling all_plugins_play to load vars for managed_node2 27712 1727096499.77905: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096499.77908: Calling groups_plugins_play to load vars for managed_node2 27712 1727096499.79064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096499.80496: done with get_vars() 27712 1727096499.80516: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:01:39 -0400 (0:00:00.040) 0:00:25.499 ****** 27712 1727096499.80599: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27712 1727096499.80906: worker is 1 (out of 1 available) 27712 1727096499.80922: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27712 1727096499.80933: done queuing things up, now waiting for results queue to drain 27712 1727096499.80935: waiting for pending results... 27712 1727096499.81125: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27712 1727096499.81218: in run() - task 0afff68d-5257-cbc7-8716-00000000007c 27712 1727096499.81230: variable 'ansible_search_path' from source: unknown 27712 1727096499.81234: variable 'ansible_search_path' from source: unknown 27712 1727096499.81261: calling self._execute() 27712 1727096499.81339: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096499.81343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096499.81353: variable 'omit' from source: magic vars 27712 1727096499.81632: variable 'ansible_distribution_major_version' from source: facts 27712 1727096499.81642: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096499.81648: variable 'omit' from source: magic vars 27712 1727096499.81690: variable 'omit' from source: magic vars 27712 1727096499.81805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096499.83573: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096499.83618: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096499.83628: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096499.83658: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096499.83685: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096499.83741: variable 'network_provider' from source: set_fact 27712 1727096499.83840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 27712 1727096499.83879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096499.83897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096499.83924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096499.83936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096499.83992: variable 'omit' from source: magic vars 27712 1727096499.84067: variable 'omit' from source: magic vars 27712 1727096499.84141: variable 'network_connections' from source: task vars 27712 1727096499.84150: variable 'interface1' from source: play vars 27712 1727096499.84203: variable 'interface1' from source: play vars 27712 1727096499.84255: variable 'interface1_mac' from source: set_fact 27712 1727096499.84384: variable 'omit' from source: magic vars 27712 1727096499.84391: variable '__lsr_ansible_managed' from source: task vars 27712 1727096499.84437: variable '__lsr_ansible_managed' from source: task vars 27712 1727096499.84617: Loaded config def from plugin (lookup/template) 27712 1727096499.84621: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27712 1727096499.84642: File lookup term: get_ansible_managed.j2 27712 1727096499.84645: variable 'ansible_search_path' from source: unknown 27712 1727096499.84649: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27712 1727096499.84660: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27712 1727096499.84677: variable 'ansible_search_path' from source: unknown 27712 1727096499.88576: variable 'ansible_managed' from source: unknown 27712 1727096499.88580: variable 'omit' from source: magic vars 27712 1727096499.88583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096499.88585: Loading Connection 'ssh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096499.88588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096499.88598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096499.88615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096499.88649: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096499.88659: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096499.88673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096499.88774: Set connection var ansible_connection to ssh 27712 1727096499.88790: Set connection var ansible_pipelining to False 27712 1727096499.88799: Set connection var ansible_timeout to 10 27712 1727096499.88807: Set connection var ansible_shell_type to sh 27712 1727096499.88818: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096499.88828: Set connection var ansible_shell_executable to /bin/sh 27712 1727096499.88857: variable 'ansible_shell_executable' from source: unknown 27712 1727096499.88865: variable 'ansible_connection' from source: unknown 27712 1727096499.88883: variable 'ansible_module_compression' from source: unknown 27712 1727096499.88891: variable 'ansible_shell_type' from source: unknown 27712 1727096499.88903: variable 'ansible_shell_executable' from source: unknown 27712 1727096499.88907: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096499.88909: variable 'ansible_pipelining' from source: unknown 27712 1727096499.88911: variable 'ansible_timeout' from source: unknown 27712 1727096499.88915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096499.89037: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096499.89057: variable 'omit' from source: magic vars 27712 1727096499.89060: starting attempt loop 27712 1727096499.89062: running the handler 27712 1727096499.89079: _low_level_execute_command(): starting 27712 1727096499.89082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096499.89552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.89555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.89558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.89560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.89617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096499.89620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.89663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.91701: stdout chunk (state=3): >>>/root <<< 27712 1727096499.91705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.91706: stdout chunk (state=3): >>><<< 27712 1727096499.91708: stderr chunk (state=3): >>><<< 27712 1727096499.91710: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096499.91712: _low_level_execute_command(): starting 27712 1727096499.91715: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055 `" && echo ansible-tmp-1727096499.9161706-28936-183166099109055="` echo /root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055 `" ) && sleep 0' 27712 1727096499.92278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096499.92294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096499.92315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.92331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096499.92347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096499.92425: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.92458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096499.92487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096499.92503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.92573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.94507: stdout chunk (state=3): >>>ansible-tmp-1727096499.9161706-28936-183166099109055=/root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055 <<< 27712 1727096499.94659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.94696: stdout chunk (state=3): >>><<< 27712 1727096499.94699: stderr chunk (state=3): >>><<< 27712 1727096499.94876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096499.9161706-28936-183166099109055=/root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096499.94881: variable 'ansible_module_compression' from source: unknown 27712 1727096499.94883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 27712 1727096499.94885: variable 'ansible_facts' from source: unknown 27712 1727096499.94987: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/AnsiballZ_network_connections.py 27712 1727096499.95133: Sending initial data 27712 1727096499.95236: Sent initial data (168 bytes) 27712 1727096499.95844: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096499.95862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096499.95898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.95917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096499.96006: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.96053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.96130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096499.97777: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096499.97830: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096499.97900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp1ka6gh_1 /root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/AnsiballZ_network_connections.py <<< 27712 1727096499.97925: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/AnsiballZ_network_connections.py" <<< 27712 1727096499.97955: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp1ka6gh_1" to remote "/root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/AnsiballZ_network_connections.py" <<< 27712 1727096499.99130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096499.99133: stdout chunk (state=3): >>><<< 27712 1727096499.99135: stderr chunk (state=3): >>><<< 27712 1727096499.99137: done transferring module to remote 27712 1727096499.99138: _low_level_execute_command(): starting 27712 1727096499.99140: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/ /root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/AnsiballZ_network_connections.py && sleep 0' 27712 1727096499.99866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096499.99872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096499.99875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.99877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096499.99879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096499.99936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096499.99948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096499.99996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.01959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096500.01974: stdout chunk (state=3): >>><<< 27712 1727096500.01989: stderr chunk (state=3): >>><<< 27712 1727096500.02011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096500.02019: _low_level_execute_command(): starting 27712 1727096500.02029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/AnsiballZ_network_connections.py && sleep 0' 27712 1727096500.02664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096500.02682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096500.02704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096500.02724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096500.02790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096500.02849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096500.02871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096500.02901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096500.02980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.31243: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "da:ee:22:eb:6c:c1", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "da:ee:22:eb:6c:c1", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27712 1727096500.32981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096500.33001: stderr chunk (state=3): >>><<< 27712 1727096500.33006: stdout chunk (state=3): >>><<< 27712 1727096500.33028: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "da:ee:22:eb:6c:c1", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "da:ee:22:eb:6c:c1", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096500.33062: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest1', 'mac': 'da:ee:22:eb:6c:c1', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.4/24', '2001:db8::6/32'], 'route': [{'network': '198.58.10.64', 'prefix': 26, 'gateway': '198.51.100.102', 'metric': 4}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096500.33073: _low_level_execute_command(): starting 27712 1727096500.33076: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096499.9161706-28936-183166099109055/ > /dev/null 2>&1 && sleep 0' 27712 1727096500.33544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096500.33571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096500.33604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096500.33661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.35478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096500.35501: stderr chunk (state=3): >>><<< 27712 1727096500.35504: stdout chunk (state=3): >>><<< 27712 1727096500.35516: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096500.35523: handler run complete 27712 1727096500.35547: attempt loop complete, returning result 27712 1727096500.35552: _execute() done 27712 1727096500.35554: dumping result to json 27712 1727096500.35565: done dumping result, returning 27712 1727096500.35574: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-cbc7-8716-00000000007c] 27712 1727096500.35577: sending task result for task 0afff68d-5257-cbc7-8716-00000000007c 27712 1727096500.35683: done sending task result for task 0afff68d-5257-cbc7-8716-00000000007c 27712 1727096500.35685: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "da:ee:22:eb:6c:c1", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa 27712 1727096500.35796: no more pending results, returning what we have 27712 1727096500.35799: results queue empty 27712 1727096500.35800: checking for any_errors_fatal 27712 1727096500.35808: done checking for any_errors_fatal 27712 1727096500.35809: checking for max_fail_percentage 27712 1727096500.35810: done checking for max_fail_percentage 27712 1727096500.35811: checking to see if all hosts have failed and the running result is not ok 27712 1727096500.35812: done checking to see if all hosts have failed 27712 1727096500.35812: getting the remaining hosts for this loop 27712 1727096500.35814: done getting the remaining hosts for this loop 27712 1727096500.35817: getting the next task for host managed_node2 27712 1727096500.35822: done getting next task for host managed_node2 27712 1727096500.35825: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27712 1727096500.35828: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096500.35837: getting variables 27712 1727096500.35838: in VariableManager get_vars() 27712 1727096500.35885: Calling all_inventory to load vars for managed_node2 27712 1727096500.35888: Calling groups_inventory to load vars for managed_node2 27712 1727096500.35889: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096500.35898: Calling all_plugins_play to load vars for managed_node2 27712 1727096500.35901: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096500.35904: Calling groups_plugins_play to load vars for managed_node2 27712 1727096500.36701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096500.37566: done with get_vars() 27712 1727096500.37586: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:01:40 -0400 (0:00:00.570) 0:00:26.069 ****** 27712 1727096500.37646: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27712 1727096500.37858: worker is 1 (out of 1 available) 27712 1727096500.37873: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27712 1727096500.37884: done queuing things up, now waiting for results queue to drain 27712 1727096500.37886: waiting for pending results... 27712 1727096500.38056: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 27712 1727096500.38150: in run() - task 0afff68d-5257-cbc7-8716-00000000007d 27712 1727096500.38162: variable 'ansible_search_path' from source: unknown 27712 1727096500.38165: variable 'ansible_search_path' from source: unknown 27712 1727096500.38196: calling self._execute() 27712 1727096500.38274: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.38278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.38285: variable 'omit' from source: magic vars 27712 1727096500.38548: variable 'ansible_distribution_major_version' from source: facts 27712 1727096500.38565: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096500.38649: variable 'network_state' from source: role '' defaults 27712 1727096500.38661: Evaluated conditional (network_state != {}): False 27712 1727096500.38665: when evaluation is False, skipping this task 27712 1727096500.38669: _execute() done 27712 1727096500.38672: dumping result to json 27712 1727096500.38677: done dumping result, returning 27712 1727096500.38684: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-cbc7-8716-00000000007d] 27712 1727096500.38689: sending task result for task 0afff68d-5257-cbc7-8716-00000000007d 27712 1727096500.38766: done sending task result for task 0afff68d-5257-cbc7-8716-00000000007d 27712 1727096500.38771: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096500.38820: no more pending results, returning what we have 27712 1727096500.38823: results queue empty 27712 1727096500.38824: checking for any_errors_fatal 27712 1727096500.38835: done checking for any_errors_fatal 27712 1727096500.38835: checking for max_fail_percentage 27712 
1727096500.38837: done checking for max_fail_percentage 27712 1727096500.38838: checking to see if all hosts have failed and the running result is not ok 27712 1727096500.38839: done checking to see if all hosts have failed 27712 1727096500.38839: getting the remaining hosts for this loop 27712 1727096500.38841: done getting the remaining hosts for this loop 27712 1727096500.38844: getting the next task for host managed_node2 27712 1727096500.38849: done getting next task for host managed_node2 27712 1727096500.38852: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27712 1727096500.38855: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096500.38870: getting variables 27712 1727096500.38872: in VariableManager get_vars() 27712 1727096500.38903: Calling all_inventory to load vars for managed_node2 27712 1727096500.38905: Calling groups_inventory to load vars for managed_node2 27712 1727096500.38907: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096500.38915: Calling all_plugins_play to load vars for managed_node2 27712 1727096500.38917: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096500.38920: Calling groups_plugins_play to load vars for managed_node2 27712 1727096500.39906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096500.41396: done with get_vars() 27712 1727096500.41417: done getting variables 27712 1727096500.41475: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:01:40 -0400 (0:00:00.038) 0:00:26.108 ****** 27712 1727096500.41507: entering _queue_task() for managed_node2/debug 27712 1727096500.41756: worker is 1 (out of 1 available) 27712 1727096500.41973: exiting _queue_task() for managed_node2/debug 27712 1727096500.41984: done queuing things up, now waiting for results queue to drain 27712 1727096500.41985: waiting for pending results... 
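The "Configure networking state" task above is skipped because network_state is still the role's empty default ({}); it only runs when a declarative state description is supplied instead of (or alongside) connection profiles. A hypothetical example of a non-empty value that would make that conditional true; the interface name and keys below are illustrative and not taken from this run:

network_state:
  interfaces:
    - name: eth0        # hypothetical interface, for illustration only
      type: ethernet
      state: up
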
27712 1727096500.42113: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27712 1727096500.42208: in run() - task 0afff68d-5257-cbc7-8716-00000000007e 27712 1727096500.42235: variable 'ansible_search_path' from source: unknown 27712 1727096500.42327: variable 'ansible_search_path' from source: unknown 27712 1727096500.42331: calling self._execute() 27712 1727096500.42397: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.42411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.42431: variable 'omit' from source: magic vars 27712 1727096500.42802: variable 'ansible_distribution_major_version' from source: facts 27712 1727096500.42819: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096500.42832: variable 'omit' from source: magic vars 27712 1727096500.42900: variable 'omit' from source: magic vars 27712 1727096500.42942: variable 'omit' from source: magic vars 27712 1727096500.42991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096500.43032: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096500.43061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096500.43173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096500.43177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096500.43180: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096500.43182: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.43185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.43269: Set connection var ansible_connection to ssh 27712 1727096500.43287: Set connection var ansible_pipelining to False 27712 1727096500.43300: Set connection var ansible_timeout to 10 27712 1727096500.43314: Set connection var ansible_shell_type to sh 27712 1727096500.43329: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096500.43340: Set connection var ansible_shell_executable to /bin/sh 27712 1727096500.43370: variable 'ansible_shell_executable' from source: unknown 27712 1727096500.43380: variable 'ansible_connection' from source: unknown 27712 1727096500.43390: variable 'ansible_module_compression' from source: unknown 27712 1727096500.43398: variable 'ansible_shell_type' from source: unknown 27712 1727096500.43406: variable 'ansible_shell_executable' from source: unknown 27712 1727096500.43419: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.43429: variable 'ansible_pipelining' from source: unknown 27712 1727096500.43527: variable 'ansible_timeout' from source: unknown 27712 1727096500.43531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.43603: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 
1727096500.43621: variable 'omit' from source: magic vars 27712 1727096500.43639: starting attempt loop 27712 1727096500.43647: running the handler 27712 1727096500.43785: variable '__network_connections_result' from source: set_fact 27712 1727096500.43845: handler run complete 27712 1727096500.43879: attempt loop complete, returning result 27712 1727096500.43888: _execute() done 27712 1727096500.43897: dumping result to json 27712 1727096500.43906: done dumping result, returning 27712 1727096500.43923: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-cbc7-8716-00000000007e] 27712 1727096500.43934: sending task result for task 0afff68d-5257-cbc7-8716-00000000007e ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa" ] } 27712 1727096500.44129: no more pending results, returning what we have 27712 1727096500.44132: results queue empty 27712 1727096500.44133: checking for any_errors_fatal 27712 1727096500.44139: done checking for any_errors_fatal 27712 1727096500.44140: checking for max_fail_percentage 27712 1727096500.44142: done checking for max_fail_percentage 27712 1727096500.44143: checking to see if all hosts have failed and the running result is not ok 27712 1727096500.44144: done checking to see if all hosts have failed 27712 1727096500.44144: getting the remaining hosts for this loop 27712 1727096500.44146: done getting the remaining hosts for this loop 27712 1727096500.44149: getting the next task for host managed_node2 27712 1727096500.44155: done getting next task for host managed_node2 27712 1727096500.44158: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27712 1727096500.44161: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096500.44173: getting variables 27712 1727096500.44175: in VariableManager get_vars() 27712 1727096500.44210: Calling all_inventory to load vars for managed_node2 27712 1727096500.44213: Calling groups_inventory to load vars for managed_node2 27712 1727096500.44215: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096500.44223: Calling all_plugins_play to load vars for managed_node2 27712 1727096500.44226: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096500.44228: Calling groups_plugins_play to load vars for managed_node2 27712 1727096500.44781: done sending task result for task 0afff68d-5257-cbc7-8716-00000000007e 27712 1727096500.44784: WORKER PROCESS EXITING 27712 1727096500.44989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096500.45922: done with get_vars() 27712 1727096500.45937: done getting variables 27712 1727096500.45977: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:01:40 -0400 (0:00:00.044) 0:00:26.153 ****** 27712 1727096500.45998: entering _queue_task() for managed_node2/debug 27712 1727096500.46183: worker is 1 (out of 1 available) 27712 1727096500.46196: exiting _queue_task() for managed_node2/debug 27712 1727096500.46208: done queuing things up, now waiting for results queue to drain 27712 1727096500.46209: waiting for pending results... 
27712 1727096500.46383: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27712 1727096500.46468: in run() - task 0afff68d-5257-cbc7-8716-00000000007f 27712 1727096500.46484: variable 'ansible_search_path' from source: unknown 27712 1727096500.46488: variable 'ansible_search_path' from source: unknown 27712 1727096500.46512: calling self._execute() 27712 1727096500.46583: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.46587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.46597: variable 'omit' from source: magic vars 27712 1727096500.46855: variable 'ansible_distribution_major_version' from source: facts 27712 1727096500.46866: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096500.46873: variable 'omit' from source: magic vars 27712 1727096500.46911: variable 'omit' from source: magic vars 27712 1727096500.46939: variable 'omit' from source: magic vars 27712 1727096500.46970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096500.46998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096500.47013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096500.47031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096500.47038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096500.47060: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096500.47063: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.47066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.47134: Set connection var ansible_connection to ssh 27712 1727096500.47144: Set connection var ansible_pipelining to False 27712 1727096500.47147: Set connection var ansible_timeout to 10 27712 1727096500.47149: Set connection var ansible_shell_type to sh 27712 1727096500.47155: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096500.47159: Set connection var ansible_shell_executable to /bin/sh 27712 1727096500.47180: variable 'ansible_shell_executable' from source: unknown 27712 1727096500.47183: variable 'ansible_connection' from source: unknown 27712 1727096500.47185: variable 'ansible_module_compression' from source: unknown 27712 1727096500.47188: variable 'ansible_shell_type' from source: unknown 27712 1727096500.47190: variable 'ansible_shell_executable' from source: unknown 27712 1727096500.47194: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.47196: variable 'ansible_pipelining' from source: unknown 27712 1727096500.47199: variable 'ansible_timeout' from source: unknown 27712 1727096500.47201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.47299: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 
1727096500.47307: variable 'omit' from source: magic vars 27712 1727096500.47311: starting attempt loop 27712 1727096500.47314: running the handler 27712 1727096500.47352: variable '__network_connections_result' from source: set_fact 27712 1727096500.47409: variable '__network_connections_result' from source: set_fact 27712 1727096500.47503: handler run complete 27712 1727096500.47522: attempt loop complete, returning result 27712 1727096500.47525: _execute() done 27712 1727096500.47528: dumping result to json 27712 1727096500.47536: done dumping result, returning 27712 1727096500.47541: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-cbc7-8716-00000000007f] 27712 1727096500.47546: sending task result for task 0afff68d-5257-cbc7-8716-00000000007f 27712 1727096500.47629: done sending task result for task 0afff68d-5257-cbc7-8716-00000000007f 27712 1727096500.47631: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "da:ee:22:eb:6c:c1", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 82850e40-6086-456d-9524-f262534a63fa" ] } } 27712 1727096500.47727: no more pending results, returning what we have 27712 1727096500.47730: results queue empty 27712 1727096500.47731: checking for any_errors_fatal 27712 1727096500.47735: done checking for any_errors_fatal 27712 1727096500.47736: checking for max_fail_percentage 27712 1727096500.47738: done checking for max_fail_percentage 27712 1727096500.47738: checking to see if all hosts have failed and the running result is not ok 27712 1727096500.47739: done checking to see if all hosts have failed 27712 1727096500.47740: getting the remaining hosts for this loop 27712 1727096500.47743: done getting the remaining hosts for this loop 27712 1727096500.47746: getting the next task for host managed_node2 27712 1727096500.47750: done getting next task for host managed_node2 27712 1727096500.47753: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27712 1727096500.47756: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096500.47765: getting variables 27712 1727096500.47766: in VariableManager get_vars() 27712 1727096500.47797: Calling all_inventory to load vars for managed_node2 27712 1727096500.47807: Calling groups_inventory to load vars for managed_node2 27712 1727096500.47809: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096500.47815: Calling all_plugins_play to load vars for managed_node2 27712 1727096500.47817: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096500.47818: Calling groups_plugins_play to load vars for managed_node2 27712 1727096500.48522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096500.49381: done with get_vars() 27712 1727096500.49396: done getting variables 27712 1727096500.49435: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:01:40 -0400 (0:00:00.034) 0:00:26.187 ****** 27712 1727096500.49457: entering _queue_task() for managed_node2/debug 27712 1727096500.49641: worker is 1 (out of 1 available) 27712 1727096500.49655: exiting _queue_task() for managed_node2/debug 27712 1727096500.49666: done queuing things up, now waiting for results queue to drain 27712 1727096500.49669: waiting for pending results... 27712 1727096500.49830: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27712 1727096500.49915: in run() - task 0afff68d-5257-cbc7-8716-000000000080 27712 1727096500.49929: variable 'ansible_search_path' from source: unknown 27712 1727096500.49932: variable 'ansible_search_path' from source: unknown 27712 1727096500.49957: calling self._execute() 27712 1727096500.50034: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.50038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.50047: variable 'omit' from source: magic vars 27712 1727096500.50312: variable 'ansible_distribution_major_version' from source: facts 27712 1727096500.50321: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096500.50406: variable 'network_state' from source: role '' defaults 27712 1727096500.50414: Evaluated conditional (network_state != {}): False 27712 1727096500.50417: when evaluation is False, skipping this task 27712 1727096500.50420: _execute() done 27712 1727096500.50427: dumping result to json 27712 1727096500.50430: done dumping result, returning 27712 1727096500.50443: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-cbc7-8716-000000000080] 27712 1727096500.50447: sending task result for task 0afff68d-5257-cbc7-8716-000000000080 27712 1727096500.50525: done sending task result for task 0afff68d-5257-cbc7-8716-000000000080 27712 1727096500.50528: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 27712 1727096500.50593: no more pending results, returning what we 
have 27712 1727096500.50596: results queue empty 27712 1727096500.50596: checking for any_errors_fatal 27712 1727096500.50602: done checking for any_errors_fatal 27712 1727096500.50603: checking for max_fail_percentage 27712 1727096500.50604: done checking for max_fail_percentage 27712 1727096500.50605: checking to see if all hosts have failed and the running result is not ok 27712 1727096500.50606: done checking to see if all hosts have failed 27712 1727096500.50606: getting the remaining hosts for this loop 27712 1727096500.50607: done getting the remaining hosts for this loop 27712 1727096500.50610: getting the next task for host managed_node2 27712 1727096500.50615: done getting next task for host managed_node2 27712 1727096500.50618: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27712 1727096500.50620: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096500.50634: getting variables 27712 1727096500.50635: in VariableManager get_vars() 27712 1727096500.50669: Calling all_inventory to load vars for managed_node2 27712 1727096500.50685: Calling groups_inventory to load vars for managed_node2 27712 1727096500.50688: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096500.50698: Calling all_plugins_play to load vars for managed_node2 27712 1727096500.50704: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096500.50707: Calling groups_plugins_play to load vars for managed_node2 27712 1727096500.51585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096500.52432: done with get_vars() 27712 1727096500.52447: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:01:40 -0400 (0:00:00.030) 0:00:26.218 ****** 27712 1727096500.52510: entering _queue_task() for managed_node2/ping 27712 1727096500.52719: worker is 1 (out of 1 available) 27712 1727096500.52729: exiting _queue_task() for managed_node2/ping 27712 1727096500.52740: done queuing things up, now waiting for results queue to drain 27712 1727096500.52741: waiting for pending results... 
27712 1727096500.53184: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 27712 1727096500.53190: in run() - task 0afff68d-5257-cbc7-8716-000000000081 27712 1727096500.53193: variable 'ansible_search_path' from source: unknown 27712 1727096500.53195: variable 'ansible_search_path' from source: unknown 27712 1727096500.53206: calling self._execute() 27712 1727096500.53311: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.53324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.53337: variable 'omit' from source: magic vars 27712 1727096500.53709: variable 'ansible_distribution_major_version' from source: facts 27712 1727096500.53728: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096500.53739: variable 'omit' from source: magic vars 27712 1727096500.53804: variable 'omit' from source: magic vars 27712 1727096500.53840: variable 'omit' from source: magic vars 27712 1727096500.53888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096500.53924: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096500.53948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096500.53975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096500.53991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096500.54021: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096500.54028: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.54035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.54137: Set connection var ansible_connection to ssh 27712 1727096500.54182: Set connection var ansible_pipelining to False 27712 1727096500.54185: Set connection var ansible_timeout to 10 27712 1727096500.54187: Set connection var ansible_shell_type to sh 27712 1727096500.54189: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096500.54191: Set connection var ansible_shell_executable to /bin/sh 27712 1727096500.54215: variable 'ansible_shell_executable' from source: unknown 27712 1727096500.54222: variable 'ansible_connection' from source: unknown 27712 1727096500.54290: variable 'ansible_module_compression' from source: unknown 27712 1727096500.54293: variable 'ansible_shell_type' from source: unknown 27712 1727096500.54296: variable 'ansible_shell_executable' from source: unknown 27712 1727096500.54298: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.54300: variable 'ansible_pipelining' from source: unknown 27712 1727096500.54302: variable 'ansible_timeout' from source: unknown 27712 1727096500.54304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.54465: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096500.54476: variable 'omit' from source: magic vars 27712 
1727096500.54481: starting attempt loop 27712 1727096500.54483: running the handler 27712 1727096500.54496: _low_level_execute_command(): starting 27712 1727096500.54502: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096500.55002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096500.55007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096500.55011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096500.55060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096500.55063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096500.55108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.56775: stdout chunk (state=3): >>>/root <<< 27712 1727096500.56915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096500.56918: stdout chunk (state=3): >>><<< 27712 1727096500.56920: stderr chunk (state=3): >>><<< 27712 1727096500.57035: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096500.57038: _low_level_execute_command(): starting 27712 1727096500.57042: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477 `" && echo 
ansible-tmp-1727096500.5694516-28967-100696098382477="` echo /root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477 `" ) && sleep 0' 27712 1727096500.57593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096500.57611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096500.57684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096500.57732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096500.57745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096500.57764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096500.57853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.59735: stdout chunk (state=3): >>>ansible-tmp-1727096500.5694516-28967-100696098382477=/root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477 <<< 27712 1727096500.59883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096500.59908: stdout chunk (state=3): >>><<< 27712 1727096500.59911: stderr chunk (state=3): >>><<< 27712 1727096500.60075: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096500.5694516-28967-100696098382477=/root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096500.60078: variable 'ansible_module_compression' from source: unknown 27712 1727096500.60080: ANSIBALLZ: using 
cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 27712 1727096500.60082: variable 'ansible_facts' from source: unknown 27712 1727096500.60146: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/AnsiballZ_ping.py 27712 1727096500.60329: Sending initial data 27712 1727096500.60338: Sent initial data (153 bytes) 27712 1727096500.60931: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096500.60948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096500.60981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096500.61079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096500.61099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096500.61115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096500.61170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.62785: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096500.62811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096500.62943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/AnsiballZ_ping.py" <<< 27712 1727096500.62947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpltb6_w2y /root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/AnsiballZ_ping.py <<< 27712 1727096500.63000: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpltb6_w2y" to remote "/root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/AnsiballZ_ping.py" <<< 27712 1727096500.63655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096500.63721: stderr chunk (state=3): >>><<< 27712 1727096500.63733: stdout chunk (state=3): >>><<< 27712 1727096500.63820: done transferring module to remote 27712 1727096500.63823: _low_level_execute_command(): starting 27712 1727096500.63825: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/ /root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/AnsiballZ_ping.py && sleep 0' 27712 1727096500.64421: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096500.64485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096500.64547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096500.64563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096500.64592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096500.64652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.66559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096500.66567: stdout chunk (state=3): >>><<< 27712 1727096500.66755: stderr chunk (state=3): >>><<< 27712 1727096500.66758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096500.66760: _low_level_execute_command(): starting 27712 1727096500.66763: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/AnsiballZ_ping.py && sleep 0' 27712 1727096500.67729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096500.67941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096500.67957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096500.68075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096500.68097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096500.68170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.83273: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27712 1727096500.84645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096500.84672: stdout chunk (state=3): >>><<< 27712 1727096500.84689: stderr chunk (state=3): >>><<< 27712 1727096500.84710: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096500.84736: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096500.84749: _low_level_execute_command(): starting 27712 1727096500.84758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096500.5694516-28967-100696098382477/ > /dev/null 2>&1 && sleep 0' 27712 1727096500.85374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096500.85378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096500.85380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096500.85383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096500.85385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
<<< 27712 1727096500.85387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096500.85475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096500.85512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096500.87480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096500.87484: stdout chunk (state=3): >>><<< 27712 1727096500.87486: stderr chunk (state=3): >>><<< 27712 1727096500.87537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096500.87787: handler run complete 27712 1727096500.87790: attempt loop complete, returning result 27712 1727096500.87792: _execute() done 27712 1727096500.87794: dumping result to json 27712 1727096500.87796: done dumping result, returning 27712 1727096500.87798: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-cbc7-8716-000000000081] 27712 1727096500.87800: sending task result for task 0afff68d-5257-cbc7-8716-000000000081 27712 1727096500.87874: done sending task result for task 0afff68d-5257-cbc7-8716-000000000081 27712 1727096500.87879: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 27712 1727096500.87960: no more pending results, returning what we have 27712 1727096500.87964: results queue empty 27712 1727096500.87965: checking for any_errors_fatal 27712 1727096500.87981: done checking for any_errors_fatal 27712 1727096500.87982: checking for max_fail_percentage 27712 1727096500.87984: done checking for max_fail_percentage 27712 1727096500.87984: checking to see if all hosts have failed and the running result is not ok 27712 1727096500.87986: done checking to see if all hosts have failed 27712 1727096500.87986: getting the remaining hosts for this loop 27712 1727096500.87988: done getting the remaining hosts for this loop 27712 1727096500.87992: getting the next task for host managed_node2 27712 1727096500.88003: done getting next task for host managed_node2 27712 1727096500.88005: ^ task is: TASK: meta (role_complete) 27712 1727096500.88009: ^ state is: HOST STATE: block=3, 
task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096500.88021: getting variables 27712 1727096500.88023: in VariableManager get_vars() 27712 1727096500.88106: Calling all_inventory to load vars for managed_node2 27712 1727096500.88110: Calling groups_inventory to load vars for managed_node2 27712 1727096500.88113: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096500.88124: Calling all_plugins_play to load vars for managed_node2 27712 1727096500.88127: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096500.88129: Calling groups_plugins_play to load vars for managed_node2 27712 1727096500.89522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096500.92081: done with get_vars() 27712 1727096500.92111: done getting variables 27712 1727096500.92189: done queuing things up, now waiting for results queue to drain 27712 1727096500.92191: results queue empty 27712 1727096500.92192: checking for any_errors_fatal 27712 1727096500.92194: done checking for any_errors_fatal 27712 1727096500.92195: checking for max_fail_percentage 27712 1727096500.92196: done checking for max_fail_percentage 27712 1727096500.92197: checking to see if all hosts have failed and the running result is not ok 27712 1727096500.92198: done checking to see if all hosts have failed 27712 1727096500.92198: getting the remaining hosts for this loop 27712 1727096500.92199: done getting the remaining hosts for this loop 27712 1727096500.92202: getting the next task for host managed_node2 27712 1727096500.92205: done getting next task for host managed_node2 27712 1727096500.92207: ^ task is: TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 27712 1727096500.92209: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096500.92211: getting variables 27712 1727096500.92216: in VariableManager get_vars() 27712 1727096500.92230: Calling all_inventory to load vars for managed_node2 27712 1727096500.92232: Calling groups_inventory to load vars for managed_node2 27712 1727096500.92234: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096500.92239: Calling all_plugins_play to load vars for managed_node2 27712 1727096500.92241: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096500.92244: Calling groups_plugins_play to load vars for managed_node2 27712 1727096500.93507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096500.95069: done with get_vars() 27712 1727096500.95090: done getting variables 27712 1727096500.95134: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the warning about specifying the route without the output device is logged for initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:122 Monday 23 September 2024 09:01:40 -0400 (0:00:00.426) 0:00:26.645 ****** 27712 1727096500.95172: entering _queue_task() for managed_node2/assert 27712 1727096500.95551: worker is 1 (out of 1 available) 27712 1727096500.95562: exiting _queue_task() for managed_node2/assert 27712 1727096500.95779: done queuing things up, now waiting for results queue to drain 27712 1727096500.95781: waiting for pending results... 
27712 1727096500.95875: running TaskExecutor() for managed_node2/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 27712 1727096500.95960: in run() - task 0afff68d-5257-cbc7-8716-0000000000b1 27712 1727096500.95976: variable 'ansible_search_path' from source: unknown 27712 1727096500.96006: calling self._execute() 27712 1727096500.96093: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.96099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.96108: variable 'omit' from source: magic vars 27712 1727096500.96387: variable 'ansible_distribution_major_version' from source: facts 27712 1727096500.96396: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096500.96478: variable 'network_provider' from source: set_fact 27712 1727096500.96483: Evaluated conditional (network_provider == "initscripts"): False 27712 1727096500.96486: when evaluation is False, skipping this task 27712 1727096500.96489: _execute() done 27712 1727096500.96492: dumping result to json 27712 1727096500.96496: done dumping result, returning 27712 1727096500.96502: done running TaskExecutor() for managed_node2/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider [0afff68d-5257-cbc7-8716-0000000000b1] 27712 1727096500.96507: sending task result for task 0afff68d-5257-cbc7-8716-0000000000b1 27712 1727096500.96601: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000b1 27712 1727096500.96604: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27712 1727096500.96650: no more pending results, returning what we have 27712 1727096500.96654: results queue empty 27712 1727096500.96655: checking for any_errors_fatal 27712 1727096500.96656: done checking for any_errors_fatal 27712 1727096500.96657: checking for max_fail_percentage 27712 1727096500.96658: done checking for max_fail_percentage 27712 1727096500.96659: checking to see if all hosts have failed and the running result is not ok 27712 1727096500.96659: done checking to see if all hosts have failed 27712 1727096500.96660: getting the remaining hosts for this loop 27712 1727096500.96662: done getting the remaining hosts for this loop 27712 1727096500.96665: getting the next task for host managed_node2 27712 1727096500.96674: done getting next task for host managed_node2 27712 1727096500.96677: ^ task is: TASK: Assert that no warning is logged for nm provider 27712 1727096500.96679: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096500.96683: getting variables 27712 1727096500.96684: in VariableManager get_vars() 27712 1727096500.96722: Calling all_inventory to load vars for managed_node2 27712 1727096500.96725: Calling groups_inventory to load vars for managed_node2 27712 1727096500.96727: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096500.96736: Calling all_plugins_play to load vars for managed_node2 27712 1727096500.96739: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096500.96741: Calling groups_plugins_play to load vars for managed_node2 27712 1727096500.97495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096500.98826: done with get_vars() 27712 1727096500.98840: done getting variables 27712 1727096500.98886: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that no warning is logged for nm provider] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:129 Monday 23 September 2024 09:01:40 -0400 (0:00:00.037) 0:00:26.682 ****** 27712 1727096500.98905: entering _queue_task() for managed_node2/assert 27712 1727096500.99101: worker is 1 (out of 1 available) 27712 1727096500.99113: exiting _queue_task() for managed_node2/assert 27712 1727096500.99125: done queuing things up, now waiting for results queue to drain 27712 1727096500.99126: waiting for pending results... 27712 1727096500.99300: running TaskExecutor() for managed_node2/TASK: Assert that no warning is logged for nm provider 27712 1727096500.99359: in run() - task 0afff68d-5257-cbc7-8716-0000000000b2 27712 1727096500.99372: variable 'ansible_search_path' from source: unknown 27712 1727096500.99402: calling self._execute() 27712 1727096500.99483: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096500.99489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096500.99498: variable 'omit' from source: magic vars 27712 1727096500.99761: variable 'ansible_distribution_major_version' from source: facts 27712 1727096500.99771: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096500.99852: variable 'network_provider' from source: set_fact 27712 1727096500.99856: Evaluated conditional (network_provider == "nm"): True 27712 1727096500.99862: variable 'omit' from source: magic vars 27712 1727096500.99884: variable 'omit' from source: magic vars 27712 1727096500.99914: variable 'omit' from source: magic vars 27712 1727096500.99945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096500.99972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096500.99991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096501.00009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096501.00015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 27712 1727096501.00036: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096501.00039: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.00041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.00111: Set connection var ansible_connection to ssh 27712 1727096501.00122: Set connection var ansible_pipelining to False 27712 1727096501.00125: Set connection var ansible_timeout to 10 27712 1727096501.00127: Set connection var ansible_shell_type to sh 27712 1727096501.00132: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096501.00137: Set connection var ansible_shell_executable to /bin/sh 27712 1727096501.00153: variable 'ansible_shell_executable' from source: unknown 27712 1727096501.00156: variable 'ansible_connection' from source: unknown 27712 1727096501.00159: variable 'ansible_module_compression' from source: unknown 27712 1727096501.00161: variable 'ansible_shell_type' from source: unknown 27712 1727096501.00163: variable 'ansible_shell_executable' from source: unknown 27712 1727096501.00165: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.00171: variable 'ansible_pipelining' from source: unknown 27712 1727096501.00176: variable 'ansible_timeout' from source: unknown 27712 1727096501.00180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.00280: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096501.00289: variable 'omit' from source: magic vars 27712 1727096501.00295: starting attempt loop 27712 1727096501.00297: running the handler 27712 1727096501.00428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096501.00773: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096501.00777: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096501.01012: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096501.01052: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096501.01166: variable '__network_connections_result' from source: set_fact 27712 1727096501.01202: Evaluated conditional (__network_connections_result.stderr is not search("")): True 27712 1727096501.01225: handler run complete 27712 1727096501.01244: attempt loop complete, returning result 27712 1727096501.01251: _execute() done 27712 1727096501.01257: dumping result to json 27712 1727096501.01264: done dumping result, returning 27712 1727096501.01280: done running TaskExecutor() for managed_node2/TASK: Assert that no warning is logged for nm provider [0afff68d-5257-cbc7-8716-0000000000b2] 27712 1727096501.01289: sending task result for task 0afff68d-5257-cbc7-8716-0000000000b2 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096501.01427: no more pending results, returning what we have 27712 1727096501.01430: results queue empty 27712 1727096501.01431: checking for any_errors_fatal 27712 
1727096501.01439: done checking for any_errors_fatal 27712 1727096501.01439: checking for max_fail_percentage 27712 1727096501.01441: done checking for max_fail_percentage 27712 1727096501.01441: checking to see if all hosts have failed and the running result is not ok 27712 1727096501.01442: done checking to see if all hosts have failed 27712 1727096501.01443: getting the remaining hosts for this loop 27712 1727096501.01444: done getting the remaining hosts for this loop 27712 1727096501.01448: getting the next task for host managed_node2 27712 1727096501.01457: done getting next task for host managed_node2 27712 1727096501.01459: ^ task is: TASK: Bring down test devices and profiles 27712 1727096501.01462: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096501.01467: getting variables 27712 1727096501.01470: in VariableManager get_vars() 27712 1727096501.01511: Calling all_inventory to load vars for managed_node2 27712 1727096501.01514: Calling groups_inventory to load vars for managed_node2 27712 1727096501.01516: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096501.01527: Calling all_plugins_play to load vars for managed_node2 27712 1727096501.01530: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096501.01533: Calling groups_plugins_play to load vars for managed_node2 27712 1727096501.02206: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000b2 27712 1727096501.02210: WORKER PROCESS EXITING 27712 1727096501.06224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096501.07233: done with get_vars() 27712 1727096501.07253: done getting variables TASK [Bring down test devices and profiles] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:140 Monday 23 September 2024 09:01:41 -0400 (0:00:00.084) 0:00:26.766 ****** 27712 1727096501.07336: entering _queue_task() for managed_node2/include_role 27712 1727096501.07338: Creating lock for include_role 27712 1727096501.07677: worker is 1 (out of 1 available) 27712 1727096501.07689: exiting _queue_task() for managed_node2/include_role 27712 1727096501.07699: done queuing things up, now waiting for results queue to drain 27712 1727096501.07701: waiting for pending results... 
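For reference, the assertion exercised above (tests_route_device.yml:129) can be reconstructed from what the executor logged: it runs only when network_provider == "nm" and checks that __network_connections_result.stderr does not match a warning pattern. A minimal sketch of such a task, assuming the ansible.builtin.assert action named in the log; the literal search pattern lives in the test playbook and is not visible here, so the regex below is a placeholder:

    - name: Assert that no warning is logged for nm provider
      ansible.builtin.assert:
        that:
          # placeholder pattern; the real test supplies the warning regex
          - __network_connections_result.stderr is not search("<warning pattern>")
      when: network_provider == "nm"

The log confirms the task name, the assert action and the evaluated conditionals, but not the exact YAML layout of the test file.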
27712 1727096501.07945: running TaskExecutor() for managed_node2/TASK: Bring down test devices and profiles 27712 1727096501.08036: in run() - task 0afff68d-5257-cbc7-8716-0000000000b4 27712 1727096501.08047: variable 'ansible_search_path' from source: unknown 27712 1727096501.08080: calling self._execute() 27712 1727096501.08164: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.08171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.08182: variable 'omit' from source: magic vars 27712 1727096501.08464: variable 'ansible_distribution_major_version' from source: facts 27712 1727096501.08477: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096501.08482: _execute() done 27712 1727096501.08485: dumping result to json 27712 1727096501.08489: done dumping result, returning 27712 1727096501.08495: done running TaskExecutor() for managed_node2/TASK: Bring down test devices and profiles [0afff68d-5257-cbc7-8716-0000000000b4] 27712 1727096501.08500: sending task result for task 0afff68d-5257-cbc7-8716-0000000000b4 27712 1727096501.08614: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000b4 27712 1727096501.08617: WORKER PROCESS EXITING 27712 1727096501.08643: no more pending results, returning what we have 27712 1727096501.08648: in VariableManager get_vars() 27712 1727096501.08691: Calling all_inventory to load vars for managed_node2 27712 1727096501.08694: Calling groups_inventory to load vars for managed_node2 27712 1727096501.08696: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096501.08708: Calling all_plugins_play to load vars for managed_node2 27712 1727096501.08710: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096501.08713: Calling groups_plugins_play to load vars for managed_node2 27712 1727096501.09465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096501.10424: done with get_vars() 27712 1727096501.10436: variable 'ansible_search_path' from source: unknown 27712 1727096501.10578: variable 'omit' from source: magic vars 27712 1727096501.10599: variable 'omit' from source: magic vars 27712 1727096501.10608: variable 'omit' from source: magic vars 27712 1727096501.10610: we have included files to process 27712 1727096501.10611: generating all_blocks data 27712 1727096501.10614: done generating all_blocks data 27712 1727096501.10618: processing included file: fedora.linux_system_roles.network 27712 1727096501.10631: in VariableManager get_vars() 27712 1727096501.10642: done with get_vars() 27712 1727096501.10661: in VariableManager get_vars() 27712 1727096501.10675: done with get_vars() 27712 1727096501.10702: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 27712 1727096501.10773: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 27712 1727096501.10816: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 27712 1727096501.11071: in VariableManager get_vars() 27712 1727096501.11086: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27712 1727096501.12309: iterating over new_blocks loaded from include file 27712 1727096501.12311: in VariableManager get_vars() 27712 1727096501.12323: done with get_vars() 27712 
1727096501.12324: filtering new block on tags 27712 1727096501.12456: done filtering new block on tags 27712 1727096501.12458: in VariableManager get_vars() 27712 1727096501.12471: done with get_vars() 27712 1727096501.12472: filtering new block on tags 27712 1727096501.12485: done filtering new block on tags 27712 1727096501.12486: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 27712 1727096501.12490: extending task lists for all hosts with included blocks 27712 1727096501.12604: done extending task lists 27712 1727096501.12605: done processing included files 27712 1727096501.12605: results queue empty 27712 1727096501.12606: checking for any_errors_fatal 27712 1727096501.12608: done checking for any_errors_fatal 27712 1727096501.12609: checking for max_fail_percentage 27712 1727096501.12609: done checking for max_fail_percentage 27712 1727096501.12610: checking to see if all hosts have failed and the running result is not ok 27712 1727096501.12610: done checking to see if all hosts have failed 27712 1727096501.12611: getting the remaining hosts for this loop 27712 1727096501.12611: done getting the remaining hosts for this loop 27712 1727096501.12613: getting the next task for host managed_node2 27712 1727096501.12615: done getting next task for host managed_node2 27712 1727096501.12617: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27712 1727096501.12619: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096501.12625: getting variables 27712 1727096501.12626: in VariableManager get_vars() 27712 1727096501.12635: Calling all_inventory to load vars for managed_node2 27712 1727096501.12636: Calling groups_inventory to load vars for managed_node2 27712 1727096501.12638: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096501.12641: Calling all_plugins_play to load vars for managed_node2 27712 1727096501.12642: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096501.12644: Calling groups_plugins_play to load vars for managed_node2 27712 1727096501.13255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096501.14106: done with get_vars() 27712 1727096501.14122: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 09:01:41 -0400 (0:00:00.068) 0:00:26.834 ****** 27712 1727096501.14165: entering _queue_task() for managed_node2/include_tasks 27712 1727096501.14398: worker is 1 (out of 1 available) 27712 1727096501.14412: exiting _queue_task() for managed_node2/include_tasks 27712 1727096501.14424: done queuing things up, now waiting for results queue to drain 27712 1727096501.14425: waiting for pending results... 27712 1727096501.14594: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 27712 1727096501.14683: in run() - task 0afff68d-5257-cbc7-8716-000000000641 27712 1727096501.14694: variable 'ansible_search_path' from source: unknown 27712 1727096501.14697: variable 'ansible_search_path' from source: unknown 27712 1727096501.14724: calling self._execute() 27712 1727096501.14800: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.14805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.14811: variable 'omit' from source: magic vars 27712 1727096501.15084: variable 'ansible_distribution_major_version' from source: facts 27712 1727096501.15096: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096501.15100: _execute() done 27712 1727096501.15102: dumping result to json 27712 1727096501.15105: done dumping result, returning 27712 1727096501.15112: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-cbc7-8716-000000000641] 27712 1727096501.15116: sending task result for task 0afff68d-5257-cbc7-8716-000000000641 27712 1727096501.15198: done sending task result for task 0afff68d-5257-cbc7-8716-000000000641 27712 1727096501.15200: WORKER PROCESS EXITING 27712 1727096501.15245: no more pending results, returning what we have 27712 1727096501.15249: in VariableManager get_vars() 27712 1727096501.15293: Calling all_inventory to load vars for managed_node2 27712 1727096501.15296: Calling groups_inventory to load vars for managed_node2 27712 1727096501.15299: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096501.15311: Calling all_plugins_play to load vars for managed_node2 27712 1727096501.15314: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096501.15316: Calling groups_plugins_play to load vars for managed_node2 27712 1727096501.16136: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096501.16998: done with get_vars() 27712 1727096501.17011: variable 'ansible_search_path' from source: unknown 27712 1727096501.17012: variable 'ansible_search_path' from source: unknown 27712 1727096501.17035: we have included files to process 27712 1727096501.17036: generating all_blocks data 27712 1727096501.17037: done generating all_blocks data 27712 1727096501.17039: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096501.17040: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096501.17041: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 27712 1727096501.17397: done processing included file 27712 1727096501.17398: iterating over new_blocks loaded from include file 27712 1727096501.17399: in VariableManager get_vars() 27712 1727096501.17414: done with get_vars() 27712 1727096501.17415: filtering new block on tags 27712 1727096501.17434: done filtering new block on tags 27712 1727096501.17436: in VariableManager get_vars() 27712 1727096501.17451: done with get_vars() 27712 1727096501.17452: filtering new block on tags 27712 1727096501.17479: done filtering new block on tags 27712 1727096501.17481: in VariableManager get_vars() 27712 1727096501.17494: done with get_vars() 27712 1727096501.17495: filtering new block on tags 27712 1727096501.17516: done filtering new block on tags 27712 1727096501.17517: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 27712 1727096501.17521: extending task lists for all hosts with included blocks 27712 1727096501.18078: done extending task lists 27712 1727096501.18080: done processing included files 27712 1727096501.18080: results queue empty 27712 1727096501.18081: checking for any_errors_fatal 27712 1727096501.18083: done checking for any_errors_fatal 27712 1727096501.18084: checking for max_fail_percentage 27712 1727096501.18085: done checking for max_fail_percentage 27712 1727096501.18086: checking to see if all hosts have failed and the running result is not ok 27712 1727096501.18086: done checking to see if all hosts have failed 27712 1727096501.18087: getting the remaining hosts for this loop 27712 1727096501.18087: done getting the remaining hosts for this loop 27712 1727096501.18089: getting the next task for host managed_node2 27712 1727096501.18092: done getting next task for host managed_node2 27712 1727096501.18093: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27712 1727096501.18096: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096501.18102: getting variables 27712 1727096501.18102: in VariableManager get_vars() 27712 1727096501.18112: Calling all_inventory to load vars for managed_node2 27712 1727096501.18113: Calling groups_inventory to load vars for managed_node2 27712 1727096501.18115: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096501.18118: Calling all_plugins_play to load vars for managed_node2 27712 1727096501.18119: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096501.18121: Calling groups_plugins_play to load vars for managed_node2 27712 1727096501.18772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096501.19613: done with get_vars() 27712 1727096501.19626: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 09:01:41 -0400 (0:00:00.055) 0:00:26.890 ****** 27712 1727096501.19673: entering _queue_task() for managed_node2/setup 27712 1727096501.19896: worker is 1 (out of 1 available) 27712 1727096501.19909: exiting _queue_task() for managed_node2/setup 27712 1727096501.19920: done queuing things up, now waiting for results queue to drain 27712 1727096501.19922: waiting for pending results... 
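The two tasks above show the standard entry path into the role: the test playbook's "Bring down test devices and profiles" task (tests_route_device.yml:140) includes fedora.linux_system_roles.network, which loads the role's defaults/main.yml, meta/main.yml and tasks/main.yml, and the role's first task (tasks/main.yml:4) pulls in tasks/set_facts.yml. A minimal sketch of that entry point, assuming the include_role / include_tasks actions the executor logged; any variables the test passes to the role (for example a tear-down connection list) are not visible in this log and are omitted:

    # tests_route_device.yml (sketch)
    - name: Bring down test devices and profiles
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      # the test presumably passes role vars here to remove the test profiles

    # roles/network/tasks/main.yml (sketch)
    - name: Ensure ansible_facts used by role
      ansible.builtin.include_tasks: set_facts.yml

This matches the block filtering and task-list extension the executor reports after loading the included files.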
27712 1727096501.20094: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 27712 1727096501.20190: in run() - task 0afff68d-5257-cbc7-8716-0000000006a7 27712 1727096501.20202: variable 'ansible_search_path' from source: unknown 27712 1727096501.20206: variable 'ansible_search_path' from source: unknown 27712 1727096501.20233: calling self._execute() 27712 1727096501.20307: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.20311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.20320: variable 'omit' from source: magic vars 27712 1727096501.20588: variable 'ansible_distribution_major_version' from source: facts 27712 1727096501.20597: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096501.20736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096501.22195: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096501.22241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096501.22268: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096501.22295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096501.22315: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096501.22375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096501.22394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096501.22412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096501.22441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096501.22452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096501.22491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096501.22507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096501.22523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096501.22551: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096501.22561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096501.22664: variable '__network_required_facts' from source: role '' defaults 27712 1727096501.22675: variable 'ansible_facts' from source: unknown 27712 1727096501.23104: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 27712 1727096501.23108: when evaluation is False, skipping this task 27712 1727096501.23111: _execute() done 27712 1727096501.23113: dumping result to json 27712 1727096501.23115: done dumping result, returning 27712 1727096501.23122: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-cbc7-8716-0000000006a7] 27712 1727096501.23127: sending task result for task 0afff68d-5257-cbc7-8716-0000000006a7 27712 1727096501.23208: done sending task result for task 0afff68d-5257-cbc7-8716-0000000006a7 27712 1727096501.23210: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096501.23252: no more pending results, returning what we have 27712 1727096501.23256: results queue empty 27712 1727096501.23257: checking for any_errors_fatal 27712 1727096501.23259: done checking for any_errors_fatal 27712 1727096501.23259: checking for max_fail_percentage 27712 1727096501.23261: done checking for max_fail_percentage 27712 1727096501.23261: checking to see if all hosts have failed and the running result is not ok 27712 1727096501.23262: done checking to see if all hosts have failed 27712 1727096501.23263: getting the remaining hosts for this loop 27712 1727096501.23264: done getting the remaining hosts for this loop 27712 1727096501.23269: getting the next task for host managed_node2 27712 1727096501.23280: done getting next task for host managed_node2 27712 1727096501.23283: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 27712 1727096501.23288: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096501.23305: getting variables 27712 1727096501.23306: in VariableManager get_vars() 27712 1727096501.23346: Calling all_inventory to load vars for managed_node2 27712 1727096501.23348: Calling groups_inventory to load vars for managed_node2 27712 1727096501.23350: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096501.23359: Calling all_plugins_play to load vars for managed_node2 27712 1727096501.23361: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096501.23363: Calling groups_plugins_play to load vars for managed_node2 27712 1727096501.24141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096501.25066: done with get_vars() 27712 1727096501.25085: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 09:01:41 -0400 (0:00:00.054) 0:00:26.944 ****** 27712 1727096501.25155: entering _queue_task() for managed_node2/stat 27712 1727096501.25375: worker is 1 (out of 1 available) 27712 1727096501.25389: exiting _queue_task() for managed_node2/stat 27712 1727096501.25401: done queuing things up, now waiting for results queue to drain 27712 1727096501.25402: waiting for pending results... 27712 1727096501.25580: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 27712 1727096501.25678: in run() - task 0afff68d-5257-cbc7-8716-0000000006a9 27712 1727096501.25689: variable 'ansible_search_path' from source: unknown 27712 1727096501.25693: variable 'ansible_search_path' from source: unknown 27712 1727096501.25720: calling self._execute() 27712 1727096501.25801: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.25805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.25813: variable 'omit' from source: magic vars 27712 1727096501.26092: variable 'ansible_distribution_major_version' from source: facts 27712 1727096501.26101: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096501.26212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096501.26400: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096501.26430: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096501.26455: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096501.26485: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096501.26570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096501.26772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096501.26776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 27712 1727096501.26778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096501.26781: variable '__network_is_ostree' from source: set_fact 27712 1727096501.26783: Evaluated conditional (not __network_is_ostree is defined): False 27712 1727096501.26786: when evaluation is False, skipping this task 27712 1727096501.26789: _execute() done 27712 1727096501.26791: dumping result to json 27712 1727096501.26793: done dumping result, returning 27712 1727096501.26795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-cbc7-8716-0000000006a9] 27712 1727096501.26797: sending task result for task 0afff68d-5257-cbc7-8716-0000000006a9 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27712 1727096501.26962: no more pending results, returning what we have 27712 1727096501.26966: results queue empty 27712 1727096501.26969: checking for any_errors_fatal 27712 1727096501.26976: done checking for any_errors_fatal 27712 1727096501.26977: checking for max_fail_percentage 27712 1727096501.26979: done checking for max_fail_percentage 27712 1727096501.26980: checking to see if all hosts have failed and the running result is not ok 27712 1727096501.26981: done checking to see if all hosts have failed 27712 1727096501.26982: getting the remaining hosts for this loop 27712 1727096501.26984: done getting the remaining hosts for this loop 27712 1727096501.26987: getting the next task for host managed_node2 27712 1727096501.26995: done getting next task for host managed_node2 27712 1727096501.26998: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27712 1727096501.27004: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096501.27142: getting variables 27712 1727096501.27144: in VariableManager get_vars() 27712 1727096501.27193: Calling all_inventory to load vars for managed_node2 27712 1727096501.27196: Calling groups_inventory to load vars for managed_node2 27712 1727096501.27198: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096501.27208: Calling all_plugins_play to load vars for managed_node2 27712 1727096501.27211: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096501.27214: Calling groups_plugins_play to load vars for managed_node2 27712 1727096501.27752: done sending task result for task 0afff68d-5257-cbc7-8716-0000000006a9 27712 1727096501.27755: WORKER PROCESS EXITING 27712 1727096501.28476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096501.29320: done with get_vars() 27712 1727096501.29334: done getting variables 27712 1727096501.29377: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 09:01:41 -0400 (0:00:00.042) 0:00:26.987 ****** 27712 1727096501.29402: entering _queue_task() for managed_node2/set_fact 27712 1727096501.29604: worker is 1 (out of 1 available) 27712 1727096501.29617: exiting _queue_task() for managed_node2/set_fact 27712 1727096501.29629: done queuing things up, now waiting for results queue to drain 27712 1727096501.29630: waiting for pending results... 
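The two skipped tasks above are the fact-gathering guards in set_facts.yml: a setup task that only runs when required facts are missing (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0, False here because the facts were already gathered), and a stat-based ostree probe that only runs when __network_is_ostree has not already been set. A minimal sketch of that pattern, assuming the setup and stat actions the log names; the gather_subset value, the probed path and the register name below are assumptions, since only the task names, the conditionals and the no_log censoring are visible in this log:

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min            # assumed; the role likely requests only what it needs
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true                    # matches the censored result in the log

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted      # assumed path, not shown in this log
      register: __ostree_booted_stat  # hypothetical name
      when: not __network_is_ostree is defined

Because all of the required facts were already present and __network_is_ostree was already set, both guards short-circuit and the run proceeds straight to service detection.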
27712 1727096501.29831: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 27712 1727096501.30007: in run() - task 0afff68d-5257-cbc7-8716-0000000006aa 27712 1727096501.30026: variable 'ansible_search_path' from source: unknown 27712 1727096501.30033: variable 'ansible_search_path' from source: unknown 27712 1727096501.30080: calling self._execute() 27712 1727096501.30195: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.30206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.30221: variable 'omit' from source: magic vars 27712 1727096501.30879: variable 'ansible_distribution_major_version' from source: facts 27712 1727096501.30895: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096501.31057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096501.31316: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096501.31362: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096501.31404: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096501.31441: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096501.31676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096501.31679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096501.31682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096501.31684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096501.31695: variable '__network_is_ostree' from source: set_fact 27712 1727096501.31709: Evaluated conditional (not __network_is_ostree is defined): False 27712 1727096501.31717: when evaluation is False, skipping this task 27712 1727096501.31723: _execute() done 27712 1727096501.31730: dumping result to json 27712 1727096501.31738: done dumping result, returning 27712 1727096501.31749: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-cbc7-8716-0000000006aa] 27712 1727096501.31758: sending task result for task 0afff68d-5257-cbc7-8716-0000000006aa skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 27712 1727096501.31900: no more pending results, returning what we have 27712 1727096501.31904: results queue empty 27712 1727096501.31905: checking for any_errors_fatal 27712 1727096501.31911: done checking for any_errors_fatal 27712 1727096501.31911: checking for max_fail_percentage 27712 1727096501.31913: done checking for max_fail_percentage 27712 1727096501.31914: checking to see 
if all hosts have failed and the running result is not ok 27712 1727096501.31914: done checking to see if all hosts have failed 27712 1727096501.31915: getting the remaining hosts for this loop 27712 1727096501.31916: done getting the remaining hosts for this loop 27712 1727096501.31919: getting the next task for host managed_node2 27712 1727096501.31928: done getting next task for host managed_node2 27712 1727096501.31932: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 27712 1727096501.31937: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096501.31958: getting variables 27712 1727096501.31959: in VariableManager get_vars() 27712 1727096501.32000: Calling all_inventory to load vars for managed_node2 27712 1727096501.32003: Calling groups_inventory to load vars for managed_node2 27712 1727096501.32005: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096501.32014: Calling all_plugins_play to load vars for managed_node2 27712 1727096501.32016: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096501.32019: Calling groups_plugins_play to load vars for managed_node2 27712 1727096501.32686: done sending task result for task 0afff68d-5257-cbc7-8716-0000000006aa 27712 1727096501.32689: WORKER PROCESS EXITING 27712 1727096501.33378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096501.34984: done with get_vars() 27712 1727096501.35010: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 09:01:41 -0400 (0:00:00.057) 0:00:27.044 ****** 27712 1727096501.35112: entering _queue_task() for managed_node2/service_facts 27712 1727096501.35394: worker is 1 (out of 1 available) 27712 1727096501.35407: exiting _queue_task() for managed_node2/service_facts 27712 1727096501.35419: done queuing things up, now waiting for results queue to drain 27712 1727096501.35423: waiting for pending results... 
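The last two tasks of set_facts.yml follow the same pattern: "Set flag to indicate system is ostree" (set_facts.yml:17) is skipped because __network_is_ostree is already defined, and "Check which services are running" (set_facts.yml:21) runs the service_facts module, whose payload is copied to the host and executed in the lines that follow. A minimal sketch, assuming the set_fact and service_facts actions the log names; the expression assigned to __network_is_ostree and the stat register it reads are assumptions:

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | d(false) }}"  # assumed expression
      when: not __network_is_ostree is defined

    - name: Check which services are running
      ansible.builtin.service_facts:
      # no register is needed: the results land in ansible_facts.services,
      # as the module output below shows ({"ansible_facts": {"services": {...}}})

Because ansible_pipelining is set to False for this connection (logged above), the service_facts module is packaged as an AnsiballZ zip, copied over sftp into a per-task directory under /root/.ansible/tmp, made executable with chmod u+x, and run with /usr/bin/python3.12; with pipelining enabled, the temp-file round trip would be skipped. The role can then consult ansible_facts.services, for example to see whether NetworkManager.service is running.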
27712 1727096501.35787: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 27712 1727096501.36014: in run() - task 0afff68d-5257-cbc7-8716-0000000006ac 27712 1727096501.36018: variable 'ansible_search_path' from source: unknown 27712 1727096501.36020: variable 'ansible_search_path' from source: unknown 27712 1727096501.36023: calling self._execute() 27712 1727096501.36071: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.36084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.36099: variable 'omit' from source: magic vars 27712 1727096501.36496: variable 'ansible_distribution_major_version' from source: facts 27712 1727096501.36513: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096501.36525: variable 'omit' from source: magic vars 27712 1727096501.36605: variable 'omit' from source: magic vars 27712 1727096501.36641: variable 'omit' from source: magic vars 27712 1727096501.36774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096501.36778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096501.36781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096501.36785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096501.36800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096501.36834: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096501.36844: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.36904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.36966: Set connection var ansible_connection to ssh 27712 1727096501.36984: Set connection var ansible_pipelining to False 27712 1727096501.36996: Set connection var ansible_timeout to 10 27712 1727096501.37008: Set connection var ansible_shell_type to sh 27712 1727096501.37025: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096501.37037: Set connection var ansible_shell_executable to /bin/sh 27712 1727096501.37064: variable 'ansible_shell_executable' from source: unknown 27712 1727096501.37076: variable 'ansible_connection' from source: unknown 27712 1727096501.37122: variable 'ansible_module_compression' from source: unknown 27712 1727096501.37125: variable 'ansible_shell_type' from source: unknown 27712 1727096501.37127: variable 'ansible_shell_executable' from source: unknown 27712 1727096501.37129: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096501.37131: variable 'ansible_pipelining' from source: unknown 27712 1727096501.37133: variable 'ansible_timeout' from source: unknown 27712 1727096501.37135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096501.37321: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096501.37347: variable 'omit' from source: magic vars 27712 
1727096501.37373: starting attempt loop 27712 1727096501.37376: running the handler 27712 1727096501.37385: _low_level_execute_command(): starting 27712 1727096501.37449: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096501.38165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096501.38194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096501.38213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096501.38286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096501.40185: stdout chunk (state=3): >>>/root <<< 27712 1727096501.40223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096501.40229: stdout chunk (state=3): >>><<< 27712 1727096501.40237: stderr chunk (state=3): >>><<< 27712 1727096501.40255: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096501.40270: _low_level_execute_command(): starting 27712 1727096501.40280: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166 `" && echo ansible-tmp-1727096501.4025583-29004-263017135869166="` echo /root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166 `" ) && 
sleep 0' 27712 1727096501.40849: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096501.40859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096501.40872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096501.40890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096501.40903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096501.40910: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096501.40919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096501.40946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096501.40949: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096501.40952: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096501.40954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096501.40977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096501.41056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096501.41060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096501.41062: stderr chunk (state=3): >>>debug2: match found <<< 27712 1727096501.41065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096501.41072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096501.41091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096501.41102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096501.41165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096501.43051: stdout chunk (state=3): >>>ansible-tmp-1727096501.4025583-29004-263017135869166=/root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166 <<< 27712 1727096501.43192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096501.43221: stderr chunk (state=3): >>><<< 27712 1727096501.43232: stdout chunk (state=3): >>><<< 27712 1727096501.43476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096501.4025583-29004-263017135869166=/root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096501.43479: variable 'ansible_module_compression' from source: unknown 27712 1727096501.43482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 27712 1727096501.43484: variable 'ansible_facts' from source: unknown 27712 1727096501.43499: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/AnsiballZ_service_facts.py 27712 1727096501.43736: Sending initial data 27712 1727096501.43739: Sent initial data (162 bytes) 27712 1727096501.44328: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096501.44823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096501.44834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096501.44897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096501.46483: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 27712 1727096501.46487: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096501.46516: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096501.46579: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmplyxd3t4p /root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/AnsiballZ_service_facts.py <<< 27712 1727096501.46585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/AnsiballZ_service_facts.py" <<< 27712 1727096501.46691: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmplyxd3t4p" to remote "/root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/AnsiballZ_service_facts.py" <<< 27712 1727096501.47956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096501.48016: stderr chunk (state=3): >>><<< 27712 1727096501.48019: stdout chunk (state=3): >>><<< 27712 1727096501.48061: done transferring module to remote 27712 1727096501.48073: _low_level_execute_command(): starting 27712 1727096501.48076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/ /root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/AnsiballZ_service_facts.py && sleep 0' 27712 1727096501.49234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096501.49243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096501.49261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096501.49273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096501.49463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096501.49623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096501.50085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096501.51787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096501.51831: stderr chunk (state=3): >>><<< 27712 1727096501.51849: stdout chunk (state=3): >>><<< 27712 1727096501.51869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096501.51874: _low_level_execute_command(): starting 27712 1727096501.51878: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/AnsiballZ_service_facts.py && sleep 0' 27712 1727096501.52442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096501.52453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096501.52575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096501.52578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096501.52581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096501.52583: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096501.52585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096501.52587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096501.52589: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096501.52591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096501.52593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096501.52594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096501.52596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096501.52598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096501.52600: stderr chunk (state=3): >>>debug2: match found <<< 27712 1727096501.52601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096501.52657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096501.52674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096501.52715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096501.52779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096503.09203: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 27712 1727096503.09220: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 27712 1727096503.09245: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": 
{"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 27712 1727096503.09276: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": 
{"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 27712 1727096503.09283: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 27712 1727096503.09289: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": 
"systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": 
"systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 27712 1727096503.10914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096503.10918: stdout chunk (state=3): >>><<< 27712 1727096503.10922: stderr chunk (state=3): >>><<< 27712 1727096503.10963: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": 
{"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": 
"man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096503.12126: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096503.12137: _low_level_execute_command(): starting 27712 1727096503.12140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096501.4025583-29004-263017135869166/ > /dev/null 2>&1 && sleep 0' 27712 1727096503.12695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096503.12789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096503.12821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096503.12844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096503.12861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096503.12931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096503.14845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096503.14850: stdout chunk (state=3): >>><<< 27712 1727096503.14856: stderr chunk (state=3): >>><<< 27712 1727096503.14879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096503.14885: handler run complete 27712 1727096503.15173: variable 'ansible_facts' from source: unknown 27712 1727096503.15254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096503.15786: variable 'ansible_facts' from source: unknown 27712 1727096503.15929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096503.16136: attempt loop complete, returning result 27712 1727096503.16141: _execute() done 27712 1727096503.16144: dumping result to json 27712 1727096503.16218: done dumping result, returning 27712 1727096503.16227: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-cbc7-8716-0000000006ac] 27712 1727096503.16232: sending task result for task 0afff68d-5257-cbc7-8716-0000000006ac 27712 1727096503.17494: done sending task result for task 0afff68d-5257-cbc7-8716-0000000006ac 27712 1727096503.17497: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096503.17603: no more pending results, returning what we have 27712 1727096503.17607: results queue empty 27712 1727096503.17608: checking for any_errors_fatal 27712 1727096503.17612: done checking for any_errors_fatal 27712 1727096503.17612: checking for max_fail_percentage 27712 1727096503.17614: done checking for max_fail_percentage 27712 1727096503.17615: checking to see if all hosts have failed and the running result is not ok 27712 1727096503.17616: done checking to see if all hosts have failed 27712 1727096503.17616: getting the remaining hosts for this loop 27712 1727096503.17617: done getting the remaining hosts for this loop 27712 1727096503.17621: getting the next task for host managed_node2 27712 1727096503.17627: done getting next task for host managed_node2 27712 1727096503.17630: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 27712 1727096503.17637: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096503.17646: getting variables 27712 1727096503.17648: in VariableManager get_vars() 27712 1727096503.17685: Calling all_inventory to load vars for managed_node2 27712 1727096503.17688: Calling groups_inventory to load vars for managed_node2 27712 1727096503.17691: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096503.17700: Calling all_plugins_play to load vars for managed_node2 27712 1727096503.17703: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096503.17706: Calling groups_plugins_play to load vars for managed_node2 27712 1727096503.19154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096503.20807: done with get_vars() 27712 1727096503.20828: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 09:01:43 -0400 (0:00:01.858) 0:00:28.902 ****** 27712 1727096503.20927: entering _queue_task() for managed_node2/package_facts 27712 1727096503.21251: worker is 1 (out of 1 available) 27712 1727096503.21262: exiting _queue_task() for managed_node2/package_facts 27712 1727096503.21276: done queuing things up, now waiting for results queue to drain 27712 1727096503.21278: waiting for pending results... 27712 1727096503.21794: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 27712 1727096503.21800: in run() - task 0afff68d-5257-cbc7-8716-0000000006ad 27712 1727096503.21803: variable 'ansible_search_path' from source: unknown 27712 1727096503.21806: variable 'ansible_search_path' from source: unknown 27712 1727096503.21809: calling self._execute() 27712 1727096503.21881: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096503.21896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096503.21911: variable 'omit' from source: magic vars 27712 1727096503.22281: variable 'ansible_distribution_major_version' from source: facts 27712 1727096503.22300: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096503.22312: variable 'omit' from source: magic vars 27712 1727096503.22397: variable 'omit' from source: magic vars 27712 1727096503.22438: variable 'omit' from source: magic vars 27712 1727096503.22488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096503.22530: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096503.22652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096503.22656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096503.22659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096503.22661: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096503.22663: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096503.22666: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 27712 1727096503.22746: Set connection var ansible_connection to ssh 27712 1727096503.22761: Set connection var ansible_pipelining to False 27712 1727096503.22774: Set connection var ansible_timeout to 10 27712 1727096503.22787: Set connection var ansible_shell_type to sh 27712 1727096503.22800: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096503.22809: Set connection var ansible_shell_executable to /bin/sh 27712 1727096503.22838: variable 'ansible_shell_executable' from source: unknown 27712 1727096503.22845: variable 'ansible_connection' from source: unknown 27712 1727096503.22853: variable 'ansible_module_compression' from source: unknown 27712 1727096503.22860: variable 'ansible_shell_type' from source: unknown 27712 1727096503.22891: variable 'ansible_shell_executable' from source: unknown 27712 1727096503.22894: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096503.22897: variable 'ansible_pipelining' from source: unknown 27712 1727096503.22899: variable 'ansible_timeout' from source: unknown 27712 1727096503.22901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096503.23101: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096503.23276: variable 'omit' from source: magic vars 27712 1727096503.23279: starting attempt loop 27712 1727096503.23281: running the handler 27712 1727096503.23284: _low_level_execute_command(): starting 27712 1727096503.23286: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096503.23872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096503.23890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096503.23905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096503.23958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096503.23974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096503.24032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096503.25909: stdout chunk (state=3): >>>/root <<< 27712 1727096503.26042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096503.26059: stdout chunk (state=3): >>><<< 27712 1727096503.26084: stderr chunk (state=3): >>><<< 27712 1727096503.26110: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096503.26131: _low_level_execute_command(): starting 27712 1727096503.26142: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701 `" && echo ansible-tmp-1727096503.2611766-29064-117137183052701="` echo /root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701 `" ) && sleep 0' 27712 1727096503.26757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096503.26774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096503.26791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096503.26807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096503.26916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096503.26954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096503.26985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096503.27158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096503.28942: stdout chunk (state=3): >>>ansible-tmp-1727096503.2611766-29064-117137183052701=/root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701 <<< 27712 1727096503.29051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096503.29106: stderr 
chunk (state=3): >>><<< 27712 1727096503.29123: stdout chunk (state=3): >>><<< 27712 1727096503.29151: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096503.2611766-29064-117137183052701=/root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096503.29208: variable 'ansible_module_compression' from source: unknown 27712 1727096503.29261: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 27712 1727096503.29326: variable 'ansible_facts' from source: unknown 27712 1727096503.29517: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/AnsiballZ_package_facts.py 27712 1727096503.29766: Sending initial data 27712 1727096503.29785: Sent initial data (162 bytes) 27712 1727096503.30658: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096503.30701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096503.30716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096503.30744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096503.30811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096503.32509: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096503.32513: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/AnsiballZ_package_facts.py" <<< 27712 1727096503.32516: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpswskuf__ /root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/AnsiballZ_package_facts.py <<< 27712 1727096503.32536: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpswskuf__" to remote "/root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/AnsiballZ_package_facts.py" <<< 27712 1727096503.35200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096503.35626: stderr chunk (state=3): >>><<< 27712 1727096503.35634: stdout chunk (state=3): >>><<< 27712 1727096503.35637: done transferring module to remote 27712 1727096503.35639: _low_level_execute_command(): starting 27712 1727096503.35641: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/ /root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/AnsiballZ_package_facts.py && sleep 0' 27712 1727096503.36597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096503.36883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096503.36919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096503.36923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096503.36958: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27712 1727096503.39189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096503.39192: stdout chunk (state=3): >>><<< 27712 1727096503.39195: stderr chunk (state=3): >>><<< 27712 1727096503.39197: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096503.39205: _low_level_execute_command(): starting 27712 1727096503.39207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/AnsiballZ_package_facts.py && sleep 0' 27712 1727096503.40351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096503.40355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096503.40519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096503.40523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096503.40534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096503.40602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096503.85276: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 27712 1727096503.85415: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 27712 1727096503.85446: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", 
"release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 27712 1727096503.85471: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": 
"vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 27712 1727096503.87553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096503.87557: stdout chunk (state=3): >>><<< 27712 1727096503.87559: stderr chunk (state=3): >>><<< 27712 1727096503.87778: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096503.92177: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096503.92181: _low_level_execute_command(): starting 27712 1727096503.92184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096503.2611766-29064-117137183052701/ > /dev/null 2>&1 && sleep 0' 27712 1727096503.93578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096503.93710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096503.93779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096503.95645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096503.95703: stderr chunk (state=3): >>><<< 27712 1727096503.95777: stdout chunk (state=3): >>><<< 27712 1727096503.95797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096503.95808: handler run complete 27712 1727096503.97736: variable 'ansible_facts' from source: unknown 27712 1727096503.98723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.03199: variable 'ansible_facts' from source: unknown 27712 1727096504.04125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.05527: attempt loop complete, returning result 27712 1727096504.05545: _execute() done 27712 1727096504.05553: dumping result to json 27712 1727096504.06153: done dumping result, returning 27712 1727096504.06156: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-cbc7-8716-0000000006ad] 27712 1727096504.06159: sending task result for task 0afff68d-5257-cbc7-8716-0000000006ad 27712 1727096504.10142: done sending task result for task 0afff68d-5257-cbc7-8716-0000000006ad 27712 1727096504.10146: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096504.10301: no more pending results, returning what we have 27712 1727096504.10304: results queue empty 27712 1727096504.10305: checking for any_errors_fatal 27712 1727096504.10310: done checking for any_errors_fatal 27712 1727096504.10311: checking for max_fail_percentage 27712 1727096504.10312: done checking for max_fail_percentage 27712 1727096504.10313: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.10314: done checking to see if all hosts have failed 27712 1727096504.10314: getting the remaining hosts for this loop 27712 1727096504.10316: done getting the remaining hosts for this loop 27712 1727096504.10319: getting the next task for host managed_node2 27712 1727096504.10326: done getting next task for host managed_node2 27712 1727096504.10329: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 27712 1727096504.10333: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 27712 1727096504.10344: getting variables 27712 1727096504.10345: in VariableManager get_vars() 27712 1727096504.10380: Calling all_inventory to load vars for managed_node2 27712 1727096504.10383: Calling groups_inventory to load vars for managed_node2 27712 1727096504.10385: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.10393: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.10396: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.10399: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.12566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.15732: done with get_vars() 27712 1727096504.15755: done getting variables 27712 1727096504.16014: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 09:01:44 -0400 (0:00:00.951) 0:00:29.853 ****** 27712 1727096504.16051: entering _queue_task() for managed_node2/debug 27712 1727096504.16542: worker is 1 (out of 1 available) 27712 1727096504.16554: exiting _queue_task() for managed_node2/debug 27712 1727096504.16565: done queuing things up, now waiting for results queue to drain 27712 1727096504.16566: waiting for pending results... 27712 1727096504.17186: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 27712 1727096504.17191: in run() - task 0afff68d-5257-cbc7-8716-000000000642 27712 1727096504.17195: variable 'ansible_search_path' from source: unknown 27712 1727096504.17198: variable 'ansible_search_path' from source: unknown 27712 1727096504.17396: calling self._execute() 27712 1727096504.17494: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.17775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.17778: variable 'omit' from source: magic vars 27712 1727096504.18576: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.18581: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.18583: variable 'omit' from source: magic vars 27712 1727096504.18585: variable 'omit' from source: magic vars 27712 1727096504.18711: variable 'network_provider' from source: set_fact 27712 1727096504.18733: variable 'omit' from source: magic vars 27712 1727096504.18780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096504.18899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096504.18927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096504.19015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096504.19030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 27712 1727096504.19061: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096504.19075: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.19085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.19193: Set connection var ansible_connection to ssh 27712 1727096504.19209: Set connection var ansible_pipelining to False 27712 1727096504.19218: Set connection var ansible_timeout to 10 27712 1727096504.19224: Set connection var ansible_shell_type to sh 27712 1727096504.19235: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096504.19244: Set connection var ansible_shell_executable to /bin/sh 27712 1727096504.19270: variable 'ansible_shell_executable' from source: unknown 27712 1727096504.19283: variable 'ansible_connection' from source: unknown 27712 1727096504.19292: variable 'ansible_module_compression' from source: unknown 27712 1727096504.19298: variable 'ansible_shell_type' from source: unknown 27712 1727096504.19310: variable 'ansible_shell_executable' from source: unknown 27712 1727096504.19317: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.19325: variable 'ansible_pipelining' from source: unknown 27712 1727096504.19332: variable 'ansible_timeout' from source: unknown 27712 1727096504.19339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.19486: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096504.19503: variable 'omit' from source: magic vars 27712 1727096504.19513: starting attempt loop 27712 1727096504.19525: running the handler 27712 1727096504.19576: handler run complete 27712 1727096504.19635: attempt loop complete, returning result 27712 1727096504.19639: _execute() done 27712 1727096504.19641: dumping result to json 27712 1727096504.19643: done dumping result, returning 27712 1727096504.19646: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-cbc7-8716-000000000642] 27712 1727096504.19648: sending task result for task 0afff68d-5257-cbc7-8716-000000000642 ok: [managed_node2] => {} MSG: Using network provider: nm 27712 1727096504.19795: no more pending results, returning what we have 27712 1727096504.19798: results queue empty 27712 1727096504.19799: checking for any_errors_fatal 27712 1727096504.19810: done checking for any_errors_fatal 27712 1727096504.19810: checking for max_fail_percentage 27712 1727096504.19812: done checking for max_fail_percentage 27712 1727096504.19812: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.19813: done checking to see if all hosts have failed 27712 1727096504.19814: getting the remaining hosts for this loop 27712 1727096504.19815: done getting the remaining hosts for this loop 27712 1727096504.19818: getting the next task for host managed_node2 27712 1727096504.19824: done getting next task for host managed_node2 27712 1727096504.19827: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27712 1727096504.19832: ^ state is: 
HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096504.19843: getting variables 27712 1727096504.19844: in VariableManager get_vars() 27712 1727096504.19887: Calling all_inventory to load vars for managed_node2 27712 1727096504.19890: Calling groups_inventory to load vars for managed_node2 27712 1727096504.19892: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.19902: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.19905: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.19908: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.20427: done sending task result for task 0afff68d-5257-cbc7-8716-000000000642 27712 1727096504.20432: WORKER PROCESS EXITING 27712 1727096504.22750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.25471: done with get_vars() 27712 1727096504.25499: done getting variables 27712 1727096504.25557: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 09:01:44 -0400 (0:00:00.095) 0:00:29.949 ****** 27712 1727096504.25601: entering _queue_task() for managed_node2/fail 27712 1727096504.26195: worker is 1 (out of 1 available) 27712 1727096504.26203: exiting _queue_task() for managed_node2/fail 27712 1727096504.26212: done queuing things up, now waiting for results queue to drain 27712 1727096504.26214: waiting for pending results... 
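For context while reading the next few stretches of the log: each of the "Abort ..." guard tasks queued here is implemented with the fail action (see the "Loading ActionModule 'fail'" lines), and the log only exposes their names, task paths, and the conditionals it evaluates, not the role source itself. The sketches interleaved below are hedged reconstructions built from exactly that information; module arguments, messages, and any condition the log does not report are assumptions. The recurring ansible_distribution_major_version != '6' check, evaluated first by every task in this section, appears to be inherited from the surrounding test play rather than from the role tasks, so it is left out of the sketches. For the task queued above (roles/network/tasks/main.yml:11), a minimal sketch consistent with the skip result that follows might be:

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: The network_state variable is not supported with the initscripts provider  # hypothetical wording
  when:
    - network_state != {}                 # reported below as the false condition for this run
    - network_provider == "initscripts"   # assumed from the task name; never reached here

Because `when` entries are ANDed and evaluated in order, evaluation stops at the first false entry, which is why only `network_state != {}` shows up as the "false_condition" in the skip result; with network_state at its role default of {}, the fail never fires.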
27712 1727096504.26275: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 27712 1727096504.26420: in run() - task 0afff68d-5257-cbc7-8716-000000000643 27712 1727096504.26550: variable 'ansible_search_path' from source: unknown 27712 1727096504.26554: variable 'ansible_search_path' from source: unknown 27712 1727096504.26557: calling self._execute() 27712 1727096504.26603: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.26617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.26632: variable 'omit' from source: magic vars 27712 1727096504.27031: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.27049: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.27177: variable 'network_state' from source: role '' defaults 27712 1727096504.27200: Evaluated conditional (network_state != {}): False 27712 1727096504.27213: when evaluation is False, skipping this task 27712 1727096504.27221: _execute() done 27712 1727096504.27228: dumping result to json 27712 1727096504.27235: done dumping result, returning 27712 1727096504.27247: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-cbc7-8716-000000000643] 27712 1727096504.27257: sending task result for task 0afff68d-5257-cbc7-8716-000000000643 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096504.27579: no more pending results, returning what we have 27712 1727096504.27583: results queue empty 27712 1727096504.27584: checking for any_errors_fatal 27712 1727096504.27593: done checking for any_errors_fatal 27712 1727096504.27594: checking for max_fail_percentage 27712 1727096504.27595: done checking for max_fail_percentage 27712 1727096504.27596: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.27597: done checking to see if all hosts have failed 27712 1727096504.27598: getting the remaining hosts for this loop 27712 1727096504.27599: done getting the remaining hosts for this loop 27712 1727096504.27604: getting the next task for host managed_node2 27712 1727096504.27610: done getting next task for host managed_node2 27712 1727096504.27614: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27712 1727096504.27619: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096504.27646: getting variables 27712 1727096504.27648: in VariableManager get_vars() 27712 1727096504.27696: Calling all_inventory to load vars for managed_node2 27712 1727096504.27699: Calling groups_inventory to load vars for managed_node2 27712 1727096504.27702: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.27714: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.27717: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.27721: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.28283: done sending task result for task 0afff68d-5257-cbc7-8716-000000000643 27712 1727096504.28286: WORKER PROCESS EXITING 27712 1727096504.29217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.30908: done with get_vars() 27712 1727096504.30931: done getting variables 27712 1727096504.30991: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 09:01:44 -0400 (0:00:00.054) 0:00:30.003 ****** 27712 1727096504.31028: entering _queue_task() for managed_node2/fail 27712 1727096504.31330: worker is 1 (out of 1 available) 27712 1727096504.31458: exiting _queue_task() for managed_node2/fail 27712 1727096504.31472: done queuing things up, now waiting for results queue to drain 27712 1727096504.31473: waiting for pending results... 
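The version guard queued above (main.yml:18) follows the same fail-with-when pattern. Its skip result below again reports `network_state != {}` as the false condition, so the version comparison implied by the task name is never reached in this run; that comparison is included here purely as an assumption:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host at major version 8 or later  # hypothetical wording
  when:
    - network_state != {}                            # reported below as the false condition
    - ansible_distribution_major_version | int < 8   # assumed from the task name; not exercised in this run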
27712 1727096504.31659: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 27712 1727096504.31881: in run() - task 0afff68d-5257-cbc7-8716-000000000644 27712 1727096504.31887: variable 'ansible_search_path' from source: unknown 27712 1727096504.31890: variable 'ansible_search_path' from source: unknown 27712 1727096504.31893: calling self._execute() 27712 1727096504.31975: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.31992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.32012: variable 'omit' from source: magic vars 27712 1727096504.32406: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.32427: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.32555: variable 'network_state' from source: role '' defaults 27712 1727096504.32573: Evaluated conditional (network_state != {}): False 27712 1727096504.32582: when evaluation is False, skipping this task 27712 1727096504.32647: _execute() done 27712 1727096504.32651: dumping result to json 27712 1727096504.32655: done dumping result, returning 27712 1727096504.32658: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-cbc7-8716-000000000644] 27712 1727096504.32661: sending task result for task 0afff68d-5257-cbc7-8716-000000000644 27712 1727096504.32738: done sending task result for task 0afff68d-5257-cbc7-8716-000000000644 27712 1727096504.32741: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096504.32797: no more pending results, returning what we have 27712 1727096504.32801: results queue empty 27712 1727096504.32803: checking for any_errors_fatal 27712 1727096504.32810: done checking for any_errors_fatal 27712 1727096504.32811: checking for max_fail_percentage 27712 1727096504.32812: done checking for max_fail_percentage 27712 1727096504.32813: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.32814: done checking to see if all hosts have failed 27712 1727096504.32815: getting the remaining hosts for this loop 27712 1727096504.32816: done getting the remaining hosts for this loop 27712 1727096504.32820: getting the next task for host managed_node2 27712 1727096504.32828: done getting next task for host managed_node2 27712 1727096504.32832: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27712 1727096504.32838: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096504.32975: getting variables 27712 1727096504.32977: in VariableManager get_vars() 27712 1727096504.33020: Calling all_inventory to load vars for managed_node2 27712 1727096504.33023: Calling groups_inventory to load vars for managed_node2 27712 1727096504.33025: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.33038: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.33042: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.33045: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.35014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.37921: done with get_vars() 27712 1727096504.37945: done getting variables 27712 1727096504.38206: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 09:01:44 -0400 (0:00:00.072) 0:00:30.075 ****** 27712 1727096504.38238: entering _queue_task() for managed_node2/fail 27712 1727096504.38741: worker is 1 (out of 1 available) 27712 1727096504.38753: exiting _queue_task() for managed_node2/fail 27712 1727096504.38765: done queuing things up, now waiting for results queue to drain 27712 1727096504.38766: waiting for pending results... 
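The teaming guard queued above (main.yml:25) is better grounded than the previous two, because the log below reports every conditional it evaluates: the distribution and version checks come back True and the team-interface expression comes back False, which is what produces the skip. A sketch built only from those reported conditionals (the fail message is still a placeholder):

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on this distribution version  # hypothetical wording
  when:
    - ansible_distribution_major_version | int > 9   # evaluated True below
    - ansible_distribution in __network_rh_distros   # evaluated True below
    # the expression below evaluated False in this run, producing the skip
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0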
27712 1727096504.39686: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 27712 1727096504.39775: in run() - task 0afff68d-5257-cbc7-8716-000000000645 27712 1727096504.39780: variable 'ansible_search_path' from source: unknown 27712 1727096504.39783: variable 'ansible_search_path' from source: unknown 27712 1727096504.39786: calling self._execute() 27712 1727096504.39837: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.39851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.39866: variable 'omit' from source: magic vars 27712 1727096504.40435: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.40877: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.40883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096504.43381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096504.43464: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096504.43510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096504.43556: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096504.43594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096504.43680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.43716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.43755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.43806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.43827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.43929: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.43953: Evaluated conditional (ansible_distribution_major_version | int > 9): True 27712 1727096504.44084: variable 'ansible_distribution' from source: facts 27712 1727096504.44094: variable '__network_rh_distros' from source: role '' defaults 27712 1727096504.44107: Evaluated conditional (ansible_distribution in __network_rh_distros): True 27712 1727096504.44375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.44409: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.44438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.44487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.44510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.44560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.44594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.44716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.44719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.44721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.44738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.44769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.44802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.44848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.44869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.45196: variable 'network_connections' from source: include params 27712 1727096504.45213: variable 'interface0' from source: play vars 27712 1727096504.45293: variable 'interface0' from source: play vars 27712 1727096504.45306: variable 'interface1' from source: play vars 27712 1727096504.45357: variable 'interface1' from source: play vars 27712 1727096504.45379: variable 'network_state' from source: role '' defaults 27712 
1727096504.45450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096504.45606: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096504.45675: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096504.45684: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096504.45732: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096504.46189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096504.46201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096504.46204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.46207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096504.46209: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 27712 1727096504.46211: when evaluation is False, skipping this task 27712 1727096504.46213: _execute() done 27712 1727096504.46216: dumping result to json 27712 1727096504.46218: done dumping result, returning 27712 1727096504.46220: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-cbc7-8716-000000000645] 27712 1727096504.46222: sending task result for task 0afff68d-5257-cbc7-8716-000000000645 skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 27712 1727096504.46358: no more pending results, returning what we have 27712 1727096504.46363: results queue empty 27712 1727096504.46364: checking for any_errors_fatal 27712 1727096504.46478: done checking for any_errors_fatal 27712 1727096504.46479: checking for max_fail_percentage 27712 1727096504.46482: done checking for max_fail_percentage 27712 1727096504.46482: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.46483: done checking to see if all hosts have failed 27712 1727096504.46484: getting the remaining hosts for this loop 27712 1727096504.46485: done getting the remaining hosts for this loop 27712 1727096504.46489: getting the next task for host managed_node2 27712 1727096504.46496: done getting next task for host managed_node2 27712 1727096504.46500: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27712 1727096504.46505: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096504.46524: getting variables 27712 1727096504.46525: in VariableManager get_vars() 27712 1727096504.46566: Calling all_inventory to load vars for managed_node2 27712 1727096504.46675: Calling groups_inventory to load vars for managed_node2 27712 1727096504.46679: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.46690: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.46693: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.46695: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.47792: done sending task result for task 0afff68d-5257-cbc7-8716-000000000645 27712 1727096504.47795: WORKER PROCESS EXITING 27712 1727096504.49945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.53124: done with get_vars() 27712 1727096504.53152: done getting variables 27712 1727096504.53418: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 09:01:44 -0400 (0:00:00.152) 0:00:30.227 ****** 27712 1727096504.53453: entering _queue_task() for managed_node2/dnf 27712 1727096504.54260: worker is 1 (out of 1 available) 27712 1727096504.54275: exiting _queue_task() for managed_node2/dnf 27712 1727096504.54288: done queuing things up, now waiting for results queue to drain 27712 1727096504.54289: waiting for pending results... 
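The DNF check queued above (main.yml:36) is the first task in this stretch that is not a fail guard; the action plugin loaded for it is dnf. The log below only shows its two conditionals (the Fedora/EL8-or-later check, True, and the wireless/team check, False), so everything else in this sketch, including the package list, check_mode, and register name, is an assumption chosen to match the "check if updates are available" wording:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # hypothetical argument; the log never shows the module args
    state: latest
  check_mode: true                  # assumed: report available updates without installing them
  register: __network_updates       # hypothetical register name
  when:
    - ansible_distribution == "Fedora" or ansible_distribution_major_version | int > 7  # evaluated True below
    - __network_wireless_connections_defined or __network_team_connections_defined      # evaluated False below, so the task is skipped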
27712 1727096504.54879: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 27712 1727096504.55139: in run() - task 0afff68d-5257-cbc7-8716-000000000646 27712 1727096504.55163: variable 'ansible_search_path' from source: unknown 27712 1727096504.55177: variable 'ansible_search_path' from source: unknown 27712 1727096504.55220: calling self._execute() 27712 1727096504.55510: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.55524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.55541: variable 'omit' from source: magic vars 27712 1727096504.56291: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.56303: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.56546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096504.58439: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096504.58573: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096504.58577: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096504.58586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096504.58618: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096504.58699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.58733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.58761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.58806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.58825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.58938: variable 'ansible_distribution' from source: facts 27712 1727096504.58947: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.58973: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 27712 1727096504.59207: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096504.59390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.59414: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.59485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.59526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.59541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.59599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.59622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.59662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.59707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.59721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.59758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.59795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.59819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.59856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.59872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.60161: variable 'network_connections' from source: include params 27712 1727096504.60166: variable 'interface0' from source: play vars 27712 1727096504.60171: variable 'interface0' from source: play vars 27712 1727096504.60173: variable 'interface1' from source: play vars 27712 1727096504.60211: variable 'interface1' from source: play vars 27712 1727096504.60312: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096504.60681: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096504.60725: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096504.60770: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096504.60874: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096504.60877: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096504.60905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096504.60949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.60988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096504.61044: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096504.61300: variable 'network_connections' from source: include params 27712 1727096504.61472: variable 'interface0' from source: play vars 27712 1727096504.61475: variable 'interface0' from source: play vars 27712 1727096504.61477: variable 'interface1' from source: play vars 27712 1727096504.61479: variable 'interface1' from source: play vars 27712 1727096504.61491: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096504.61500: when evaluation is False, skipping this task 27712 1727096504.61508: _execute() done 27712 1727096504.61516: dumping result to json 27712 1727096504.61524: done dumping result, returning 27712 1727096504.61543: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000646] 27712 1727096504.61559: sending task result for task 0afff68d-5257-cbc7-8716-000000000646 27712 1727096504.61684: done sending task result for task 0afff68d-5257-cbc7-8716-000000000646 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096504.61752: no more pending results, returning what we have 27712 1727096504.61756: results queue empty 27712 1727096504.61757: checking for any_errors_fatal 27712 1727096504.61765: done checking for any_errors_fatal 27712 1727096504.61766: checking for max_fail_percentage 27712 1727096504.61770: done checking for max_fail_percentage 27712 1727096504.61770: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.61772: done checking to see if all hosts have failed 27712 1727096504.61772: getting the remaining hosts for this loop 27712 1727096504.61774: done getting the remaining 
hosts for this loop 27712 1727096504.61778: getting the next task for host managed_node2 27712 1727096504.61786: done getting next task for host managed_node2 27712 1727096504.61790: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27712 1727096504.61795: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096504.61819: getting variables 27712 1727096504.61821: in VariableManager get_vars() 27712 1727096504.62172: Calling all_inventory to load vars for managed_node2 27712 1727096504.62175: Calling groups_inventory to load vars for managed_node2 27712 1727096504.62178: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.62187: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.62190: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.62194: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.62881: WORKER PROCESS EXITING 27712 1727096504.65054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.67011: done with get_vars() 27712 1727096504.67036: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 27712 1727096504.67115: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 09:01:44 -0400 (0:00:00.136) 0:00:30.364 ****** 27712 1727096504.67147: entering _queue_task() for managed_node2/yum 27712 1727096504.67984: worker is 1 (out of 1 available) 27712 1727096504.67999: exiting _queue_task() for managed_node2/yum 27712 1727096504.68013: done queuing things up, now waiting for results queue to drain 27712 1727096504.68015: waiting for pending results... 
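The YUM variant queued above (main.yml:48) is the legacy counterpart of the previous check, and the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line shows ansible-core resolving the yum action to the dnf implementation. Only the guard is visible in the log; mirroring the previous sketch under the same assumptions:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # hypothetical, as in the DNF sketch above
    state: latest
  check_mode: true                  # assumed
  when:
    - ansible_distribution_major_version | int < 8  # reported below as the false condition; this host is well past EL7, so the task is skipped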
27712 1727096504.68382: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 27712 1727096504.68524: in run() - task 0afff68d-5257-cbc7-8716-000000000647 27712 1727096504.68537: variable 'ansible_search_path' from source: unknown 27712 1727096504.68541: variable 'ansible_search_path' from source: unknown 27712 1727096504.68585: calling self._execute() 27712 1727096504.68673: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.68681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.68691: variable 'omit' from source: magic vars 27712 1727096504.69143: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.69146: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.69243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096504.71380: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096504.71678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096504.71708: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096504.71733: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096504.71753: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096504.71840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.71875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.71886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.71935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.72024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.72048: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.72175: Evaluated conditional (ansible_distribution_major_version | int < 8): False 27712 1727096504.72178: when evaluation is False, skipping this task 27712 1727096504.72180: _execute() done 27712 1727096504.72182: dumping result to json 27712 1727096504.72184: done dumping result, returning 27712 1727096504.72186: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000647] 27712 
1727096504.72188: sending task result for task 0afff68d-5257-cbc7-8716-000000000647 27712 1727096504.72252: done sending task result for task 0afff68d-5257-cbc7-8716-000000000647 27712 1727096504.72254: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 27712 1727096504.72309: no more pending results, returning what we have 27712 1727096504.72312: results queue empty 27712 1727096504.72313: checking for any_errors_fatal 27712 1727096504.72320: done checking for any_errors_fatal 27712 1727096504.72320: checking for max_fail_percentage 27712 1727096504.72322: done checking for max_fail_percentage 27712 1727096504.72323: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.72323: done checking to see if all hosts have failed 27712 1727096504.72324: getting the remaining hosts for this loop 27712 1727096504.72325: done getting the remaining hosts for this loop 27712 1727096504.72329: getting the next task for host managed_node2 27712 1727096504.72335: done getting next task for host managed_node2 27712 1727096504.72339: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27712 1727096504.72343: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096504.72360: getting variables 27712 1727096504.72361: in VariableManager get_vars() 27712 1727096504.72403: Calling all_inventory to load vars for managed_node2 27712 1727096504.72405: Calling groups_inventory to load vars for managed_node2 27712 1727096504.72407: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.72415: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.72417: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.72420: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.76898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.77743: done with get_vars() 27712 1727096504.77760: done getting variables 27712 1727096504.77798: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 09:01:44 -0400 (0:00:00.106) 0:00:30.471 ****** 27712 1727096504.77818: entering _queue_task() for managed_node2/fail 27712 1727096504.78084: worker is 1 (out of 1 available) 27712 1727096504.78097: exiting _queue_task() for managed_node2/fail 27712 1727096504.78110: done queuing things up, now waiting for results queue to drain 27712 1727096504.78111: waiting for pending results... 
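Despite the "ask user's consent" wording of the task queued above (main.yml:60), the action module loaded for it is fail, which suggests it aborts unless some consent variable is set rather than prompting interactively. Only the wireless/team condition below comes from the log; the consent variable is a purely hypothetical name used for illustration:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Managing wireless or team interfaces requires restarting NetworkManager; refusing without explicit consent  # hypothetical wording
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined  # evaluated False below, so the task is skipped
    - not network_allow_restart | default(false)                                    # hypothetical consent variable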
27712 1727096504.78298: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 27712 1727096504.78402: in run() - task 0afff68d-5257-cbc7-8716-000000000648 27712 1727096504.78411: variable 'ansible_search_path' from source: unknown 27712 1727096504.78414: variable 'ansible_search_path' from source: unknown 27712 1727096504.78448: calling self._execute() 27712 1727096504.78521: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.78525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.78533: variable 'omit' from source: magic vars 27712 1727096504.78823: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.78833: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.78921: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096504.79050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096504.80542: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096504.80596: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096504.80624: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096504.80651: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096504.80673: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096504.80733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.80755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.80773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.80801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.80811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.80851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.80864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.80884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.80909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.80920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.80949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.80966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.80988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.81011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.81021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.81135: variable 'network_connections' from source: include params 27712 1727096504.81145: variable 'interface0' from source: play vars 27712 1727096504.81203: variable 'interface0' from source: play vars 27712 1727096504.81212: variable 'interface1' from source: play vars 27712 1727096504.81256: variable 'interface1' from source: play vars 27712 1727096504.81311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096504.81434: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096504.81461: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096504.81488: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096504.81510: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096504.81542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096504.81557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096504.81579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.81597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 
1727096504.81638: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096504.81794: variable 'network_connections' from source: include params 27712 1727096504.81797: variable 'interface0' from source: play vars 27712 1727096504.81844: variable 'interface0' from source: play vars 27712 1727096504.81850: variable 'interface1' from source: play vars 27712 1727096504.81896: variable 'interface1' from source: play vars 27712 1727096504.81913: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096504.81916: when evaluation is False, skipping this task 27712 1727096504.81920: _execute() done 27712 1727096504.81923: dumping result to json 27712 1727096504.81929: done dumping result, returning 27712 1727096504.81939: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-000000000648] 27712 1727096504.81952: sending task result for task 0afff68d-5257-cbc7-8716-000000000648 27712 1727096504.82028: done sending task result for task 0afff68d-5257-cbc7-8716-000000000648 27712 1727096504.82031: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096504.82094: no more pending results, returning what we have 27712 1727096504.82097: results queue empty 27712 1727096504.82097: checking for any_errors_fatal 27712 1727096504.82104: done checking for any_errors_fatal 27712 1727096504.82104: checking for max_fail_percentage 27712 1727096504.82106: done checking for max_fail_percentage 27712 1727096504.82107: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.82107: done checking to see if all hosts have failed 27712 1727096504.82108: getting the remaining hosts for this loop 27712 1727096504.82109: done getting the remaining hosts for this loop 27712 1727096504.82113: getting the next task for host managed_node2 27712 1727096504.82120: done getting next task for host managed_node2 27712 1727096504.82123: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 27712 1727096504.82127: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096504.82145: getting variables 27712 1727096504.82146: in VariableManager get_vars() 27712 1727096504.82199: Calling all_inventory to load vars for managed_node2 27712 1727096504.82202: Calling groups_inventory to load vars for managed_node2 27712 1727096504.82204: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.82213: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.82216: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.82218: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.83013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.83886: done with get_vars() 27712 1727096504.83902: done getting variables 27712 1727096504.83945: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 09:01:44 -0400 (0:00:00.061) 0:00:30.533 ****** 27712 1727096504.83972: entering _queue_task() for managed_node2/package 27712 1727096504.84216: worker is 1 (out of 1 available) 27712 1727096504.84231: exiting _queue_task() for managed_node2/package 27712 1727096504.84242: done queuing things up, now waiting for results queue to drain 27712 1727096504.84244: waiting for pending results... 27712 1727096504.84427: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 27712 1727096504.84521: in run() - task 0afff68d-5257-cbc7-8716-000000000649 27712 1727096504.84532: variable 'ansible_search_path' from source: unknown 27712 1727096504.84536: variable 'ansible_search_path' from source: unknown 27712 1727096504.84564: calling self._execute() 27712 1727096504.84644: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.84649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.84657: variable 'omit' from source: magic vars 27712 1727096504.84938: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.84948: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.85081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096504.85270: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096504.85305: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096504.85329: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096504.85387: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096504.85464: variable 'network_packages' from source: role '' defaults 27712 1727096504.85536: variable '__network_provider_setup' from source: role '' defaults 27712 1727096504.85544: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096504.85599: variable 
'__network_service_name_default_nm' from source: role '' defaults 27712 1727096504.85607: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096504.85649: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096504.85762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096504.87326: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096504.87362: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096504.87391: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096504.87414: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096504.87435: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096504.87494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.87514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.87532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.87561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.87575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.87604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.87620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.87637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.87665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.87678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.87818: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27712 1727096504.87955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.87959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.87962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.87996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.88000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.88093: variable 'ansible_python' from source: facts 27712 1727096504.88192: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27712 1727096504.88196: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096504.88262: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096504.88408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.88442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.88559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.88562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.88564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.88594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096504.88629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096504.88658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.88717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096504.88777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096504.88901: variable 'network_connections' from source: include params 27712 1727096504.88914: variable 'interface0' from source: play vars 27712 1727096504.88998: variable 'interface0' from source: play vars 27712 1727096504.89013: variable 'interface1' from source: play vars 27712 1727096504.89083: variable 'interface1' from source: play vars 27712 1727096504.89141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096504.89159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096504.89186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096504.89207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096504.89246: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096504.89425: variable 'network_connections' from source: include params 27712 1727096504.89430: variable 'interface0' from source: play vars 27712 1727096504.89503: variable 'interface0' from source: play vars 27712 1727096504.89511: variable 'interface1' from source: play vars 27712 1727096504.89584: variable 'interface1' from source: play vars 27712 1727096504.89606: variable '__network_packages_default_wireless' from source: role '' defaults 27712 1727096504.89660: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096504.89852: variable 'network_connections' from source: include params 27712 1727096504.89855: variable 'interface0' from source: play vars 27712 1727096504.89907: variable 'interface0' from source: play vars 27712 1727096504.89913: variable 'interface1' from source: play vars 27712 1727096504.89957: variable 'interface1' from source: play vars 27712 1727096504.89977: variable '__network_packages_default_team' from source: role '' defaults 27712 1727096504.90033: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096504.90225: variable 'network_connections' from source: include params 27712 1727096504.90228: variable 'interface0' from source: play vars 27712 1727096504.90277: variable 'interface0' from source: play vars 27712 1727096504.90284: variable 'interface1' from source: play vars 27712 1727096504.90332: variable 'interface1' from source: play vars 27712 1727096504.90369: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096504.90412: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096504.90418: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096504.90461: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096504.90600: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27712 1727096504.91160: variable 'network_connections' from source: include params 27712 
1727096504.91163: variable 'interface0' from source: play vars 27712 1727096504.91165: variable 'interface0' from source: play vars 27712 1727096504.91169: variable 'interface1' from source: play vars 27712 1727096504.91171: variable 'interface1' from source: play vars 27712 1727096504.91173: variable 'ansible_distribution' from source: facts 27712 1727096504.91175: variable '__network_rh_distros' from source: role '' defaults 27712 1727096504.91177: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.91179: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27712 1727096504.91380: variable 'ansible_distribution' from source: facts 27712 1727096504.91383: variable '__network_rh_distros' from source: role '' defaults 27712 1727096504.91386: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.91388: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27712 1727096504.91444: variable 'ansible_distribution' from source: facts 27712 1727096504.91448: variable '__network_rh_distros' from source: role '' defaults 27712 1727096504.91453: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.91486: variable 'network_provider' from source: set_fact 27712 1727096504.91500: variable 'ansible_facts' from source: unknown 27712 1727096504.92133: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 27712 1727096504.92137: when evaluation is False, skipping this task 27712 1727096504.92140: _execute() done 27712 1727096504.92142: dumping result to json 27712 1727096504.92144: done dumping result, returning 27712 1727096504.92153: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-cbc7-8716-000000000649] 27712 1727096504.92158: sending task result for task 0afff68d-5257-cbc7-8716-000000000649 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 27712 1727096504.92312: no more pending results, returning what we have 27712 1727096504.92316: results queue empty 27712 1727096504.92317: checking for any_errors_fatal 27712 1727096504.92323: done checking for any_errors_fatal 27712 1727096504.92323: checking for max_fail_percentage 27712 1727096504.92325: done checking for max_fail_percentage 27712 1727096504.92326: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.92326: done checking to see if all hosts have failed 27712 1727096504.92327: getting the remaining hosts for this loop 27712 1727096504.92328: done getting the remaining hosts for this loop 27712 1727096504.92332: getting the next task for host managed_node2 27712 1727096504.92338: done getting next task for host managed_node2 27712 1727096504.92342: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27712 1727096504.92346: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096504.92370: getting variables 27712 1727096504.92372: in VariableManager get_vars() 27712 1727096504.92412: Calling all_inventory to load vars for managed_node2 27712 1727096504.92414: Calling groups_inventory to load vars for managed_node2 27712 1727096504.92417: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.92426: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.92428: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.92431: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.92983: done sending task result for task 0afff68d-5257-cbc7-8716-000000000649 27712 1727096504.92986: WORKER PROCESS EXITING 27712 1727096504.94032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096504.95635: done with get_vars() 27712 1727096504.95660: done getting variables 27712 1727096504.95729: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 09:01:44 -0400 (0:00:00.117) 0:00:30.650 ****** 27712 1727096504.95764: entering _queue_task() for managed_node2/package 27712 1727096504.96113: worker is 1 (out of 1 available) 27712 1727096504.96125: exiting _queue_task() for managed_node2/package 27712 1727096504.96138: done queuing things up, now waiting for results queue to drain 27712 1727096504.96140: waiting for pending results... 
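
The "Install packages" step above was skipped because every entry in network_packages was already present in the host's package facts, so the role's when: guard evaluated False. As a hedged illustration of that idempotence pattern (this is not the role's actual task file at roles/network/tasks/main.yml:73; the package list and the play wrapper are assumptions), a standalone playbook exercising the same conditional could look like:

```yaml
---
# Hedged sketch only -- not the fedora.linux_system_roles.network source.
# It reproduces the guard behind the "Install packages" skip logged above:
# install only when something in network_packages is missing from the
# gathered package facts.
- hosts: managed_node2
  gather_facts: false
  vars:
    network_packages:            # assumed value for illustration
      - NetworkManager
  tasks:
    - name: Gather installed-package facts for the conditional below
      ansible.builtin.package_facts:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
```

On a host where every listed package is already installed, the `subset` test is true, the negation is false, and the task reports the same "Conditional result was False" skip seen in the log.
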
27712 1727096504.96499: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 27712 1727096504.96616: in run() - task 0afff68d-5257-cbc7-8716-00000000064a 27712 1727096504.96636: variable 'ansible_search_path' from source: unknown 27712 1727096504.96645: variable 'ansible_search_path' from source: unknown 27712 1727096504.96689: calling self._execute() 27712 1727096504.96798: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096504.96922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096504.96926: variable 'omit' from source: magic vars 27712 1727096504.97231: variable 'ansible_distribution_major_version' from source: facts 27712 1727096504.97256: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096504.97386: variable 'network_state' from source: role '' defaults 27712 1727096504.97402: Evaluated conditional (network_state != {}): False 27712 1727096504.97410: when evaluation is False, skipping this task 27712 1727096504.97418: _execute() done 27712 1727096504.97425: dumping result to json 27712 1727096504.97434: done dumping result, returning 27712 1727096504.97448: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-cbc7-8716-00000000064a] 27712 1727096504.97466: sending task result for task 0afff68d-5257-cbc7-8716-00000000064a skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096504.97735: no more pending results, returning what we have 27712 1727096504.97739: results queue empty 27712 1727096504.97740: checking for any_errors_fatal 27712 1727096504.97749: done checking for any_errors_fatal 27712 1727096504.97750: checking for max_fail_percentage 27712 1727096504.97752: done checking for max_fail_percentage 27712 1727096504.97753: checking to see if all hosts have failed and the running result is not ok 27712 1727096504.97754: done checking to see if all hosts have failed 27712 1727096504.97755: getting the remaining hosts for this loop 27712 1727096504.97756: done getting the remaining hosts for this loop 27712 1727096504.97760: getting the next task for host managed_node2 27712 1727096504.97771: done getting next task for host managed_node2 27712 1727096504.97775: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27712 1727096504.97780: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096504.97807: getting variables 27712 1727096504.97809: in VariableManager get_vars() 27712 1727096504.97855: Calling all_inventory to load vars for managed_node2 27712 1727096504.97858: Calling groups_inventory to load vars for managed_node2 27712 1727096504.97861: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096504.98012: Calling all_plugins_play to load vars for managed_node2 27712 1727096504.98016: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096504.98020: Calling groups_plugins_play to load vars for managed_node2 27712 1727096504.98679: done sending task result for task 0afff68d-5257-cbc7-8716-00000000064a 27712 1727096504.98683: WORKER PROCESS EXITING 27712 1727096504.99347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096505.01069: done with get_vars() 27712 1727096505.01091: done getting variables 27712 1727096505.01145: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 09:01:45 -0400 (0:00:00.054) 0:00:30.705 ****** 27712 1727096505.01177: entering _queue_task() for managed_node2/package 27712 1727096505.01506: worker is 1 (out of 1 available) 27712 1727096505.01518: exiting _queue_task() for managed_node2/package 27712 1727096505.01642: done queuing things up, now waiting for results queue to drain 27712 1727096505.01643: waiting for pending results... 
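
Both nmstate-related install tasks in this stretch (main.yml:85 and main.yml:96) are gated on the same expression, network_state != {}; since the play never supplies a network_state, the role-style default of an empty dict keeps both tasks skipped. A minimal, hedged sketch of that gate (illustrative only; the package names and play wrapper are assumptions, not the role's source):

```yaml
---
# Hedged sketch of the network_state gate behind the two skips logged above.
# With an empty-dict default, `network_state != {}` is False and the install
# task is skipped, exactly as reported for managed_node2.
- hosts: managed_node2
  gather_facts: false
  vars:
    network_state: {}            # default-style value; set a state to trigger the install
  tasks:
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager       # assumed package list for illustration
          - nmstate
        state: present
      when: network_state != {}
```
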
27712 1727096505.01822: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 27712 1727096505.01958: in run() - task 0afff68d-5257-cbc7-8716-00000000064b 27712 1727096505.01986: variable 'ansible_search_path' from source: unknown 27712 1727096505.01994: variable 'ansible_search_path' from source: unknown 27712 1727096505.02032: calling self._execute() 27712 1727096505.02137: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096505.02149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096505.02163: variable 'omit' from source: magic vars 27712 1727096505.02583: variable 'ansible_distribution_major_version' from source: facts 27712 1727096505.02601: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096505.02737: variable 'network_state' from source: role '' defaults 27712 1727096505.02754: Evaluated conditional (network_state != {}): False 27712 1727096505.02762: when evaluation is False, skipping this task 27712 1727096505.02771: _execute() done 27712 1727096505.02778: dumping result to json 27712 1727096505.02785: done dumping result, returning 27712 1727096505.02796: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-cbc7-8716-00000000064b] 27712 1727096505.02805: sending task result for task 0afff68d-5257-cbc7-8716-00000000064b skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096505.02989: no more pending results, returning what we have 27712 1727096505.02993: results queue empty 27712 1727096505.02994: checking for any_errors_fatal 27712 1727096505.03001: done checking for any_errors_fatal 27712 1727096505.03002: checking for max_fail_percentage 27712 1727096505.03004: done checking for max_fail_percentage 27712 1727096505.03005: checking to see if all hosts have failed and the running result is not ok 27712 1727096505.03006: done checking to see if all hosts have failed 27712 1727096505.03006: getting the remaining hosts for this loop 27712 1727096505.03008: done getting the remaining hosts for this loop 27712 1727096505.03011: getting the next task for host managed_node2 27712 1727096505.03020: done getting next task for host managed_node2 27712 1727096505.03023: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27712 1727096505.03028: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096505.03052: getting variables 27712 1727096505.03053: in VariableManager get_vars() 27712 1727096505.03097: Calling all_inventory to load vars for managed_node2 27712 1727096505.03100: Calling groups_inventory to load vars for managed_node2 27712 1727096505.03102: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096505.03114: Calling all_plugins_play to load vars for managed_node2 27712 1727096505.03117: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096505.03120: Calling groups_plugins_play to load vars for managed_node2 27712 1727096505.03783: done sending task result for task 0afff68d-5257-cbc7-8716-00000000064b 27712 1727096505.03787: WORKER PROCESS EXITING 27712 1727096505.04725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096505.06859: done with get_vars() 27712 1727096505.06888: done getting variables 27712 1727096505.06949: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 09:01:45 -0400 (0:00:00.058) 0:00:30.763 ****** 27712 1727096505.06987: entering _queue_task() for managed_node2/service 27712 1727096505.07327: worker is 1 (out of 1 available) 27712 1727096505.07341: exiting _queue_task() for managed_node2/service 27712 1727096505.07352: done queuing things up, now waiting for results queue to drain 27712 1727096505.07354: waiting for pending results... 
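
The task queued here (main.yml:109) restarts NetworkManager only when the connection list contains wireless or team profiles; with two plain Ethernet profiles coming in through interface0/interface1, both defined-flags stay false and the restart is skipped, as the next entries show. A hedged sketch of that shape (the connection list, the profile names, and the way the flags are derived are illustrative assumptions, not the role's implementation):

```yaml
---
# Hedged sketch: restart the service only if a wireless or team connection is
# requested. The flags are derived with selectattr here for illustration; the
# real role computes them in its defaults from the include params.
- hosts: managed_node2
  gather_facts: false
  vars:
    network_connections:                  # assumed: two plain Ethernet profiles
      - {name: ethtest0, type: ethernet}
      - {name: ethtest1, type: ethernet}
    __network_wireless_connections_defined: >-
      {{ network_connections | selectattr('type', 'equalto', 'wireless') | list | length > 0 }}
    __network_team_connections_defined: >-
      {{ network_connections | selectattr('type', 'equalto', 'team') | list | length > 0 }}
  tasks:
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
```
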
27712 1727096505.07651: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 27712 1727096505.07795: in run() - task 0afff68d-5257-cbc7-8716-00000000064c 27712 1727096505.07816: variable 'ansible_search_path' from source: unknown 27712 1727096505.07825: variable 'ansible_search_path' from source: unknown 27712 1727096505.07870: calling self._execute() 27712 1727096505.07984: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096505.07998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096505.08019: variable 'omit' from source: magic vars 27712 1727096505.08439: variable 'ansible_distribution_major_version' from source: facts 27712 1727096505.08458: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096505.08583: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096505.08779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096505.10880: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096505.10959: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096505.11001: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096505.11039: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096505.11076: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096505.11149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.11187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.11220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.11264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.11286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.11338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.11372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.11411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 27712 1727096505.11444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.11572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.11575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.11578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.11580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.11601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.11621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.11799: variable 'network_connections' from source: include params 27712 1727096505.11819: variable 'interface0' from source: play vars 27712 1727096505.11894: variable 'interface0' from source: play vars 27712 1727096505.11915: variable 'interface1' from source: play vars 27712 1727096505.11981: variable 'interface1' from source: play vars 27712 1727096505.12060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096505.12580: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096505.12625: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096505.12662: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096505.12773: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096505.12778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096505.12780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096505.12802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.12831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096505.12885: variable '__network_team_connections_defined' from source: role '' 
defaults 27712 1727096505.13101: variable 'network_connections' from source: include params 27712 1727096505.13173: variable 'interface0' from source: play vars 27712 1727096505.13182: variable 'interface0' from source: play vars 27712 1727096505.13193: variable 'interface1' from source: play vars 27712 1727096505.13259: variable 'interface1' from source: play vars 27712 1727096505.13288: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 27712 1727096505.13295: when evaluation is False, skipping this task 27712 1727096505.13301: _execute() done 27712 1727096505.13306: dumping result to json 27712 1727096505.13312: done dumping result, returning 27712 1727096505.13320: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-cbc7-8716-00000000064c] 27712 1727096505.13343: sending task result for task 0afff68d-5257-cbc7-8716-00000000064c skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 27712 1727096505.13484: no more pending results, returning what we have 27712 1727096505.13488: results queue empty 27712 1727096505.13489: checking for any_errors_fatal 27712 1727096505.13494: done checking for any_errors_fatal 27712 1727096505.13495: checking for max_fail_percentage 27712 1727096505.13497: done checking for max_fail_percentage 27712 1727096505.13498: checking to see if all hosts have failed and the running result is not ok 27712 1727096505.13499: done checking to see if all hosts have failed 27712 1727096505.13499: getting the remaining hosts for this loop 27712 1727096505.13501: done getting the remaining hosts for this loop 27712 1727096505.13505: getting the next task for host managed_node2 27712 1727096505.13512: done getting next task for host managed_node2 27712 1727096505.13516: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27712 1727096505.13520: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096505.13538: getting variables 27712 1727096505.13540: in VariableManager get_vars() 27712 1727096505.13583: Calling all_inventory to load vars for managed_node2 27712 1727096505.13586: Calling groups_inventory to load vars for managed_node2 27712 1727096505.13589: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096505.13599: Calling all_plugins_play to load vars for managed_node2 27712 1727096505.13602: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096505.13605: Calling groups_plugins_play to load vars for managed_node2 27712 1727096505.14484: done sending task result for task 0afff68d-5257-cbc7-8716-00000000064c 27712 1727096505.14488: WORKER PROCESS EXITING 27712 1727096505.15401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096505.16948: done with get_vars() 27712 1727096505.16971: done getting variables 27712 1727096505.17024: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 09:01:45 -0400 (0:00:00.100) 0:00:30.863 ****** 27712 1727096505.17053: entering _queue_task() for managed_node2/service 27712 1727096505.17336: worker is 1 (out of 1 available) 27712 1727096505.17347: exiting _queue_task() for managed_node2/service 27712 1727096505.17358: done queuing things up, now waiting for results queue to drain 27712 1727096505.17360: waiting for pending results... 
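
The "Enable and start NetworkManager" task queued here (main.yml:122) is the first one in this block that actually runs: the entries that follow show network_provider resolving to "nm", so the when: gate passes and the service action is executed over SSH. A hedged sketch of a task with that gate (the service-name value and the enabled/started shape are assumptions based on the variable names in the log, not the role's source):

```yaml
---
# Hedged sketch of the gate that lets this task run: the nm provider is in
# use, so `network_provider == "nm" or network_state != {}` evaluates True.
- hosts: managed_node2
  gather_facts: false
  vars:
    network_provider: nm                    # matches the set_fact value reported in the log
    network_state: {}
    network_service_name: NetworkManager    # assumed value of the role default named in the log
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
```
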
27712 1727096505.17735: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 27712 1727096505.17995: in run() - task 0afff68d-5257-cbc7-8716-00000000064d 27712 1727096505.18014: variable 'ansible_search_path' from source: unknown 27712 1727096505.18022: variable 'ansible_search_path' from source: unknown 27712 1727096505.18118: calling self._execute() 27712 1727096505.18325: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096505.18336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096505.18349: variable 'omit' from source: magic vars 27712 1727096505.19048: variable 'ansible_distribution_major_version' from source: facts 27712 1727096505.19281: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096505.19542: variable 'network_provider' from source: set_fact 27712 1727096505.19651: variable 'network_state' from source: role '' defaults 27712 1727096505.19654: Evaluated conditional (network_provider == "nm" or network_state != {}): True 27712 1727096505.19656: variable 'omit' from source: magic vars 27712 1727096505.19659: variable 'omit' from source: magic vars 27712 1727096505.19661: variable 'network_service_name' from source: role '' defaults 27712 1727096505.19736: variable 'network_service_name' from source: role '' defaults 27712 1727096505.19971: variable '__network_provider_setup' from source: role '' defaults 27712 1727096505.20082: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096505.20151: variable '__network_service_name_default_nm' from source: role '' defaults 27712 1727096505.20209: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096505.20416: variable '__network_packages_default_nm' from source: role '' defaults 27712 1727096505.20852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096505.23656: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096505.23731: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096505.23771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096505.23809: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096505.23836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096505.23920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.23954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.23983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.24032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 27712 1727096505.24053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.24108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.24137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.24169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.24218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.24238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.24469: variable '__network_packages_default_gobject_packages' from source: role '' defaults 27712 1727096505.24586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.24615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.24647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.24695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.24715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.24809: variable 'ansible_python' from source: facts 27712 1727096505.24835: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 27712 1727096505.24919: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096505.25005: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096505.25134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.25164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.25200: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.25242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.25262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.25321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.25417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.25420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.25430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.25451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.25596: variable 'network_connections' from source: include params 27712 1727096505.25609: variable 'interface0' from source: play vars 27712 1727096505.25688: variable 'interface0' from source: play vars 27712 1727096505.25706: variable 'interface1' from source: play vars 27712 1727096505.25786: variable 'interface1' from source: play vars 27712 1727096505.25898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096505.26091: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096505.26176: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096505.26192: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096505.26235: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096505.26302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096505.26335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096505.26375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.26474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096505.26477: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096505.26757: variable 'network_connections' from source: include params 27712 1727096505.26771: variable 'interface0' from source: play vars 27712 1727096505.26850: variable 'interface0' from source: play vars 27712 1727096505.26866: variable 'interface1' from source: play vars 27712 1727096505.26944: variable 'interface1' from source: play vars 27712 1727096505.26983: variable '__network_packages_default_wireless' from source: role '' defaults 27712 1727096505.27068: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096505.27361: variable 'network_connections' from source: include params 27712 1727096505.27474: variable 'interface0' from source: play vars 27712 1727096505.27478: variable 'interface0' from source: play vars 27712 1727096505.27481: variable 'interface1' from source: play vars 27712 1727096505.27534: variable 'interface1' from source: play vars 27712 1727096505.27561: variable '__network_packages_default_team' from source: role '' defaults 27712 1727096505.27648: variable '__network_team_connections_defined' from source: role '' defaults 27712 1727096505.27954: variable 'network_connections' from source: include params 27712 1727096505.27963: variable 'interface0' from source: play vars 27712 1727096505.28039: variable 'interface0' from source: play vars 27712 1727096505.28051: variable 'interface1' from source: play vars 27712 1727096505.28140: variable 'interface1' from source: play vars 27712 1727096505.28183: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096505.28249: variable '__network_service_name_default_initscripts' from source: role '' defaults 27712 1727096505.28358: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096505.28361: variable '__network_packages_default_initscripts' from source: role '' defaults 27712 1727096505.28543: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 27712 1727096505.29051: variable 'network_connections' from source: include params 27712 1727096505.29062: variable 'interface0' from source: play vars 27712 1727096505.29127: variable 'interface0' from source: play vars 27712 1727096505.29140: variable 'interface1' from source: play vars 27712 1727096505.29194: variable 'interface1' from source: play vars 27712 1727096505.29207: variable 'ansible_distribution' from source: facts 27712 1727096505.29215: variable '__network_rh_distros' from source: role '' defaults 27712 1727096505.29230: variable 'ansible_distribution_major_version' from source: facts 27712 1727096505.29248: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 27712 1727096505.29428: variable 'ansible_distribution' from source: facts 27712 1727096505.29440: variable '__network_rh_distros' from source: role '' defaults 27712 1727096505.29455: variable 'ansible_distribution_major_version' from source: facts 27712 1727096505.29474: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 27712 1727096505.29653: variable 'ansible_distribution' from source: facts 27712 1727096505.29666: variable '__network_rh_distros' from source: role '' defaults 27712 1727096505.29678: variable 'ansible_distribution_major_version' 
from source: facts 27712 1727096505.29774: variable 'network_provider' from source: set_fact 27712 1727096505.29777: variable 'omit' from source: magic vars 27712 1727096505.29779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096505.29782: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096505.29804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096505.29824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096505.29838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096505.29872: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096505.29885: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096505.29892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096505.29994: Set connection var ansible_connection to ssh 27712 1727096505.30006: Set connection var ansible_pipelining to False 27712 1727096505.30014: Set connection var ansible_timeout to 10 27712 1727096505.30019: Set connection var ansible_shell_type to sh 27712 1727096505.30028: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096505.30035: Set connection var ansible_shell_executable to /bin/sh 27712 1727096505.30058: variable 'ansible_shell_executable' from source: unknown 27712 1727096505.30064: variable 'ansible_connection' from source: unknown 27712 1727096505.30098: variable 'ansible_module_compression' from source: unknown 27712 1727096505.30100: variable 'ansible_shell_type' from source: unknown 27712 1727096505.30102: variable 'ansible_shell_executable' from source: unknown 27712 1727096505.30107: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096505.30109: variable 'ansible_pipelining' from source: unknown 27712 1727096505.30110: variable 'ansible_timeout' from source: unknown 27712 1727096505.30112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096505.30371: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096505.30376: variable 'omit' from source: magic vars 27712 1727096505.30379: starting attempt loop 27712 1727096505.30381: running the handler 27712 1727096505.30383: variable 'ansible_facts' from source: unknown 27712 1727096505.31112: _low_level_execute_command(): starting 27712 1727096505.31125: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096505.31885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096505.31937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096505.31962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096505.31979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096505.32094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096505.33799: stdout chunk (state=3): >>>/root <<< 27712 1727096505.33935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096505.33946: stdout chunk (state=3): >>><<< 27712 1727096505.33958: stderr chunk (state=3): >>><<< 27712 1727096505.33984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096505.34261: _low_level_execute_command(): starting 27712 1727096505.34264: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839 `" && echo ansible-tmp-1727096505.34175-29139-76584638395839="` echo /root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839 `" ) && sleep 0' 27712 1727096505.35385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096505.35615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096505.35692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096505.35760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096505.37700: stdout chunk (state=3): >>>ansible-tmp-1727096505.34175-29139-76584638395839=/root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839 <<< 27712 1727096505.37889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096505.37933: stderr chunk (state=3): >>><<< 27712 1727096505.37983: stdout chunk (state=3): >>><<< 27712 1727096505.38006: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096505.34175-29139-76584638395839=/root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096505.38037: variable 'ansible_module_compression' from source: unknown 27712 1727096505.38096: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 27712 1727096505.38161: variable 'ansible_facts' from source: unknown 27712 1727096505.38781: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/AnsiballZ_systemd.py 27712 1727096505.39372: Sending initial data 27712 1727096505.39376: Sent initial data (153 bytes) 27712 1727096505.40090: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096505.40097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096505.40114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096505.40122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
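The OpenSSH debug output repeated around each low-level command ("auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b'", "mux_client_request_session") shows the run reusing one persistent ControlMaster socket per host, while pipelining is disabled ("Set connection var ansible_pipelining to False" above), which is why Ansible first creates a remote temp directory and then copies AnsiballZ_systemd.py into it over SFTP. A minimal, illustrative sketch of the connection variables behind this behaviour follows; the file name group_vars/all.yml and the specific option values are assumptions, not taken from this run:

    # group_vars/all.yml (hypothetical) -- only makes the default behaviour seen in this log explicit
    ansible_connection: ssh
    # ControlMaster multiplexing: reuse one SSH session per host, matching the
    # "auto-mux: Trying existing master" lines in the stderr above.
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
      -o ControlPath=~/.ansible/cp/%C
    # Pipelining is off in this run, so each module is written to a remote
    # /root/.ansible/tmp/ansible-tmp-.../ directory and executed from there.
    ansible_pipelining: false
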
27712 1727096505.40284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096505.40302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096505.40305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096505.40400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096505.41991: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096505.42057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096505.42061: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp65mi_h2u /root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/AnsiballZ_systemd.py <<< 27712 1727096505.42133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/AnsiballZ_systemd.py" <<< 27712 1727096505.42214: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp65mi_h2u" to remote "/root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/AnsiballZ_systemd.py" <<< 27712 1727096505.45401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096505.45404: stderr chunk (state=3): >>><<< 27712 1727096505.45406: stdout chunk (state=3): >>><<< 27712 1727096505.45408: done transferring module to remote 27712 1727096505.45410: _low_level_execute_command(): starting 27712 1727096505.45412: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/ /root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/AnsiballZ_systemd.py && sleep 0' 27712 1727096505.46279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096505.46292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096505.46305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096505.46321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096505.46385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096505.46427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096505.46448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096505.46509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096505.46521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096505.48938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096505.48941: stdout chunk (state=3): >>><<< 27712 1727096505.48948: stderr chunk (state=3): >>><<< 27712 1727096505.48971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096505.48976: _low_level_execute_command(): starting 27712 1727096505.49021: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/AnsiballZ_systemd.py && sleep 0' 27712 1727096505.49631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096505.49684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096505.49749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096505.49773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096505.49793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096505.49845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096505.79271: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": 
"0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4661248", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3309223936", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1718779000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 27712 1727096505.79308: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", 
"LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": 
"network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 27712 1727096505.81274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096505.81279: stderr chunk (state=3): >>>Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096505.81281: stdout chunk (state=3): >>><<< 27712 1727096505.81289: stderr chunk (state=3): >>><<< 27712 1727096505.81314: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6933", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainStartTimestampMonotonic": "148866720", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ExecMainHandoffTimestampMonotonic": "148882347", "ExecMainPID": "6933", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4661248", "MemoryPeak": "7372800", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3309223936", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1718779000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target network.target NetworkManager-wait-online.service multi-user.target cloud-init.service", "After": "systemd-journald.socket dbus-broker.service sysinit.target system.slice basic.target cloud-init-local.service dbus.socket network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:25 EDT", "StateChangeTimestampMonotonic": "267537564", "InactiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveExitTimestampMonotonic": "148867374", "ActiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveEnterTimestampMonotonic": "148958112", "ActiveExitTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ActiveExitTimestampMonotonic": "148846079", "InactiveEnterTimestamp": "Mon 2024-09-23 08:53:26 EDT", "InactiveEnterTimestampMonotonic": "148863571", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:53:26 EDT", "ConditionTimestampMonotonic": "148865593", "AssertTimestamp": "Mon 2024-09-23 08:53:26 EDT", "AssertTimestampMonotonic": "148865596", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "1d8325a356394de09dff7606f8803703", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096505.81521: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096505.81541: _low_level_execute_command(): starting 27712 1727096505.81546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096505.34175-29139-76584638395839/ > /dev/null 2>&1 && sleep 0' 27712 1727096505.82272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096505.82276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096505.82278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096505.82280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096505.82282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096505.82284: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096505.82414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096505.82423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096505.82439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096505.84289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096505.84362: stderr chunk (state=3): >>><<< 27712 1727096505.84366: stdout chunk (state=3): >>><<< 27712 1727096505.84577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096505.84581: handler run complete 27712 1727096505.84583: attempt loop complete, returning result 27712 1727096505.84586: _execute() done 27712 1727096505.84588: dumping result to json 27712 1727096505.84590: done dumping result, returning 27712 1727096505.84592: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-cbc7-8716-00000000064d] 27712 1727096505.84594: sending task result for task 0afff68d-5257-cbc7-8716-00000000064d ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096505.85090: no more pending results, returning what we have 27712 1727096505.85094: results queue empty 27712 1727096505.85095: checking for any_errors_fatal 27712 1727096505.85101: done checking for any_errors_fatal 27712 1727096505.85102: checking for max_fail_percentage 27712 1727096505.85103: done checking for max_fail_percentage 27712 1727096505.85104: checking to see if all hosts have failed and the running result is not ok 27712 1727096505.85105: done checking to see if all hosts have failed 27712 1727096505.85106: getting the remaining hosts for this loop 27712 1727096505.85107: done getting the remaining hosts for this loop 27712 1727096505.85111: getting the next task for host managed_node2 27712 1727096505.85118: done getting next task for host managed_node2 27712 1727096505.85121: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27712 1727096505.85126: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096505.85138: getting variables 27712 1727096505.85140: in VariableManager get_vars() 27712 1727096505.85182: Calling all_inventory to load vars for managed_node2 27712 1727096505.85185: Calling groups_inventory to load vars for managed_node2 27712 1727096505.85188: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096505.85199: Calling all_plugins_play to load vars for managed_node2 27712 1727096505.85202: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096505.85205: Calling groups_plugins_play to load vars for managed_node2 27712 1727096505.85887: done sending task result for task 0afff68d-5257-cbc7-8716-00000000064d 27712 1727096505.85890: WORKER PROCESS EXITING 27712 1727096505.87085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096505.88202: done with get_vars() 27712 1727096505.88221: done getting variables 27712 1727096505.88264: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 09:01:45 -0400 (0:00:00.712) 0:00:31.576 ****** 27712 1727096505.88298: entering _queue_task() for managed_node2/service 27712 1727096505.88537: worker is 1 (out of 1 available) 27712 1727096505.88552: exiting _queue_task() for managed_node2/service 27712 1727096505.88565: done queuing things up, now waiting for results queue to drain 27712 1727096505.88566: waiting for pending results... 
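The module_args dumped with the result above ({"name": "NetworkManager", "state": "started", "enabled": true, ...}) show what the 'service' action plugin handed to ansible.legacy.systemd for the "Enable and start NetworkManager" task, and the censored "ok" result confirms that no_log: true was in effect. A rough, illustrative YAML equivalent of that invocation is sketched below; it is not the role's actual task (which lives in the role's tasks/main.yml, referenced in the task paths printed in this log), and it uses the ansible.builtin.systemd name rather than the ansible.legacy.systemd resolution seen in the run:

    # Illustrative sketch only, reconstructed from the module_args above.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true   # why the result above is rendered as "censored"

The wpa_supplicant task queued next goes through the same 'service' action plugin, but is skipped once its conditional (__network_wpa_supplicant_required) evaluates to False, as the following entries show.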
27712 1727096505.88746: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 27712 1727096505.88838: in run() - task 0afff68d-5257-cbc7-8716-00000000064e 27712 1727096505.88849: variable 'ansible_search_path' from source: unknown 27712 1727096505.88852: variable 'ansible_search_path' from source: unknown 27712 1727096505.88884: calling self._execute() 27712 1727096505.88957: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096505.88962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096505.88971: variable 'omit' from source: magic vars 27712 1727096505.89260: variable 'ansible_distribution_major_version' from source: facts 27712 1727096505.89263: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096505.89380: variable 'network_provider' from source: set_fact 27712 1727096505.89384: Evaluated conditional (network_provider == "nm"): True 27712 1727096505.89495: variable '__network_wpa_supplicant_required' from source: role '' defaults 27712 1727096505.89557: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 27712 1727096505.89724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096505.91501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096505.91544: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096505.91571: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096505.91597: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096505.91616: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096505.91678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.91699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.91718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.91744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.91755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.91794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.91809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 27712 1727096505.91826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.91850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.91860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.91895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096505.91910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096505.91926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.91949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096505.91959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096505.92054: variable 'network_connections' from source: include params 27712 1727096505.92063: variable 'interface0' from source: play vars 27712 1727096505.92120: variable 'interface0' from source: play vars 27712 1727096505.92129: variable 'interface1' from source: play vars 27712 1727096505.92172: variable 'interface1' from source: play vars 27712 1727096505.92233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 27712 1727096505.92342: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 27712 1727096505.92372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 27712 1727096505.92408: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 27712 1727096505.92430: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 27712 1727096505.92473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 27712 1727096505.92541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 27712 1727096505.92544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096505.92546: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 27712 1727096505.92788: variable '__network_wireless_connections_defined' from source: role '' defaults 27712 1727096505.92823: variable 'network_connections' from source: include params 27712 1727096505.92828: variable 'interface0' from source: play vars 27712 1727096505.92891: variable 'interface0' from source: play vars 27712 1727096505.92906: variable 'interface1' from source: play vars 27712 1727096505.92956: variable 'interface1' from source: play vars 27712 1727096505.92989: Evaluated conditional (__network_wpa_supplicant_required): False 27712 1727096505.92993: when evaluation is False, skipping this task 27712 1727096505.93002: _execute() done 27712 1727096505.93013: dumping result to json 27712 1727096505.93018: done dumping result, returning 27712 1727096505.93021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-cbc7-8716-00000000064e] 27712 1727096505.93023: sending task result for task 0afff68d-5257-cbc7-8716-00000000064e 27712 1727096505.93106: done sending task result for task 0afff68d-5257-cbc7-8716-00000000064e 27712 1727096505.93109: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 27712 1727096505.93164: no more pending results, returning what we have 27712 1727096505.93168: results queue empty 27712 1727096505.93170: checking for any_errors_fatal 27712 1727096505.93191: done checking for any_errors_fatal 27712 1727096505.93192: checking for max_fail_percentage 27712 1727096505.93194: done checking for max_fail_percentage 27712 1727096505.93195: checking to see if all hosts have failed and the running result is not ok 27712 1727096505.93195: done checking to see if all hosts have failed 27712 1727096505.93196: getting the remaining hosts for this loop 27712 1727096505.93197: done getting the remaining hosts for this loop 27712 1727096505.93201: getting the next task for host managed_node2 27712 1727096505.93208: done getting next task for host managed_node2 27712 1727096505.93211: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 27712 1727096505.93215: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096505.93348: getting variables 27712 1727096505.93350: in VariableManager get_vars() 27712 1727096505.93389: Calling all_inventory to load vars for managed_node2 27712 1727096505.93392: Calling groups_inventory to load vars for managed_node2 27712 1727096505.93395: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096505.93403: Calling all_plugins_play to load vars for managed_node2 27712 1727096505.93406: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096505.93409: Calling groups_plugins_play to load vars for managed_node2 27712 1727096505.94600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096505.95460: done with get_vars() 27712 1727096505.95479: done getting variables 27712 1727096505.95523: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 09:01:45 -0400 (0:00:00.072) 0:00:31.648 ****** 27712 1727096505.95545: entering _queue_task() for managed_node2/service 27712 1727096505.95784: worker is 1 (out of 1 available) 27712 1727096505.95799: exiting _queue_task() for managed_node2/service 27712 1727096505.95812: done queuing things up, now waiting for results queue to drain 27712 1727096505.95814: waiting for pending results... 27712 1727096505.96056: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 27712 1727096505.96374: in run() - task 0afff68d-5257-cbc7-8716-00000000064f 27712 1727096505.96379: variable 'ansible_search_path' from source: unknown 27712 1727096505.96382: variable 'ansible_search_path' from source: unknown 27712 1727096505.96385: calling self._execute() 27712 1727096505.96387: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096505.96389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096505.96392: variable 'omit' from source: magic vars 27712 1727096505.96685: variable 'ansible_distribution_major_version' from source: facts 27712 1727096505.96694: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096505.96797: variable 'network_provider' from source: set_fact 27712 1727096505.96801: Evaluated conditional (network_provider == "initscripts"): False 27712 1727096505.96804: when evaluation is False, skipping this task 27712 1727096505.96807: _execute() done 27712 1727096505.96809: dumping result to json 27712 1727096505.96812: done dumping result, returning 27712 1727096505.96820: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-cbc7-8716-00000000064f] 27712 1727096505.96824: sending task result for task 0afff68d-5257-cbc7-8716-00000000064f 27712 1727096505.96942: done sending task result for task 0afff68d-5257-cbc7-8716-00000000064f skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 27712 1727096505.97098: no more pending results, returning 
what we have 27712 1727096505.97101: results queue empty 27712 1727096505.97102: checking for any_errors_fatal 27712 1727096505.97107: done checking for any_errors_fatal 27712 1727096505.97108: checking for max_fail_percentage 27712 1727096505.97109: done checking for max_fail_percentage 27712 1727096505.97110: checking to see if all hosts have failed and the running result is not ok 27712 1727096505.97111: done checking to see if all hosts have failed 27712 1727096505.97112: getting the remaining hosts for this loop 27712 1727096505.97113: done getting the remaining hosts for this loop 27712 1727096505.97115: getting the next task for host managed_node2 27712 1727096505.97121: done getting next task for host managed_node2 27712 1727096505.97135: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27712 1727096505.97139: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096505.97156: getting variables 27712 1727096505.97165: in VariableManager get_vars() 27712 1727096505.97208: Calling all_inventory to load vars for managed_node2 27712 1727096505.97210: Calling groups_inventory to load vars for managed_node2 27712 1727096505.97213: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096505.97219: WORKER PROCESS EXITING 27712 1727096505.97227: Calling all_plugins_play to load vars for managed_node2 27712 1727096505.97230: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096505.97233: Calling groups_plugins_play to load vars for managed_node2 27712 1727096505.98420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096505.99520: done with get_vars() 27712 1727096505.99542: done getting variables 27712 1727096505.99615: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 09:01:45 -0400 (0:00:00.041) 0:00:31.689 ****** 27712 1727096505.99652: entering _queue_task() for managed_node2/copy 27712 1727096506.00029: worker is 1 (out of 1 available) 27712 1727096506.00043: exiting _queue_task() for managed_node2/copy 27712 1727096506.00058: done queuing things up, now waiting for results queue to drain 27712 1727096506.00060: waiting for pending results... 
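
Both the "Enable network service" task just skipped (main.yml:142) and the "Ensure initscripts network file dependency is present" task just queued (main.yml:150) are gated on the initscripts provider, and this run resolved network_provider to NetworkManager, so each conditional evaluates to False. The sketch below is only an approximation of tasks with that shape, not the role's verbatim source; the service unit name and destination path are illustrative assumptions, while the when conditions and no_log flag are the ones visible in the log.

    - name: Enable network service                                  # shape of the task at main.yml:142
      service:
        name: network                                               # assumed unit name, illustration only
        enabled: true
      no_log: true                                                  # the skip result above is censored by no_log
      when:
        - ansible_distribution_major_version != '6'                 # evaluated True in this run
        - network_provider == "initscripts"                         # evaluated False, so the task is skipped

    - name: Ensure initscripts network file dependency is present   # shape of the task at main.yml:150
      copy:
        dest: /etc/sysconfig/network                                # assumed destination, illustration only
        content: ""
      when:
        - network_provider == "initscripts"                         # also False here, so this copy is skipped too
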
27712 1727096506.00450: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 27712 1727096506.00673: in run() - task 0afff68d-5257-cbc7-8716-000000000650 27712 1727096506.00677: variable 'ansible_search_path' from source: unknown 27712 1727096506.00680: variable 'ansible_search_path' from source: unknown 27712 1727096506.00691: calling self._execute() 27712 1727096506.00804: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096506.00819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096506.00833: variable 'omit' from source: magic vars 27712 1727096506.01256: variable 'ansible_distribution_major_version' from source: facts 27712 1727096506.01266: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096506.01351: variable 'network_provider' from source: set_fact 27712 1727096506.01357: Evaluated conditional (network_provider == "initscripts"): False 27712 1727096506.01360: when evaluation is False, skipping this task 27712 1727096506.01364: _execute() done 27712 1727096506.01366: dumping result to json 27712 1727096506.01370: done dumping result, returning 27712 1727096506.01377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-cbc7-8716-000000000650] 27712 1727096506.01379: sending task result for task 0afff68d-5257-cbc7-8716-000000000650 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 27712 1727096506.01523: no more pending results, returning what we have 27712 1727096506.01527: results queue empty 27712 1727096506.01528: checking for any_errors_fatal 27712 1727096506.01533: done checking for any_errors_fatal 27712 1727096506.01534: checking for max_fail_percentage 27712 1727096506.01536: done checking for max_fail_percentage 27712 1727096506.01537: checking to see if all hosts have failed and the running result is not ok 27712 1727096506.01537: done checking to see if all hosts have failed 27712 1727096506.01538: getting the remaining hosts for this loop 27712 1727096506.01539: done getting the remaining hosts for this loop 27712 1727096506.01542: getting the next task for host managed_node2 27712 1727096506.01549: done getting next task for host managed_node2 27712 1727096506.01552: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27712 1727096506.01556: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096506.01579: getting variables 27712 1727096506.01581: in VariableManager get_vars() 27712 1727096506.01614: Calling all_inventory to load vars for managed_node2 27712 1727096506.01617: Calling groups_inventory to load vars for managed_node2 27712 1727096506.01619: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096506.01627: Calling all_plugins_play to load vars for managed_node2 27712 1727096506.01629: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096506.01632: Calling groups_plugins_play to load vars for managed_node2 27712 1727096506.02181: done sending task result for task 0afff68d-5257-cbc7-8716-000000000650 27712 1727096506.02185: WORKER PROCESS EXITING 27712 1727096506.02421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096506.03925: done with get_vars() 27712 1727096506.03947: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 09:01:46 -0400 (0:00:00.043) 0:00:31.733 ****** 27712 1727096506.04030: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27712 1727096506.04317: worker is 1 (out of 1 available) 27712 1727096506.04329: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 27712 1727096506.04341: done queuing things up, now waiting for results queue to drain 27712 1727096506.04343: waiting for pending results... 27712 1727096506.04683: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 27712 1727096506.04757: in run() - task 0afff68d-5257-cbc7-8716-000000000651 27712 1727096506.04819: variable 'ansible_search_path' from source: unknown 27712 1727096506.04823: variable 'ansible_search_path' from source: unknown 27712 1727096506.04825: calling self._execute() 27712 1727096506.04915: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096506.04926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096506.04933: variable 'omit' from source: magic vars 27712 1727096506.05299: variable 'ansible_distribution_major_version' from source: facts 27712 1727096506.05310: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096506.05316: variable 'omit' from source: magic vars 27712 1727096506.05378: variable 'omit' from source: magic vars 27712 1727096506.05531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 27712 1727096506.07620: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 27712 1727096506.07691: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 27712 1727096506.07722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 27712 1727096506.07825: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 27712 1727096506.07829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 27712 1727096506.07871: variable 'network_provider' from source: set_fact 27712 1727096506.08006: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 27712 1727096506.08039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 27712 1727096506.08058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 27712 1727096506.08109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 27712 1727096506.08123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 27712 1727096506.08266: variable 'omit' from source: magic vars 27712 1727096506.08307: variable 'omit' from source: magic vars 27712 1727096506.08418: variable 'network_connections' from source: include params 27712 1727096506.08429: variable 'interface0' from source: play vars 27712 1727096506.08495: variable 'interface0' from source: play vars 27712 1727096506.08511: variable 'interface1' from source: play vars 27712 1727096506.08584: variable 'interface1' from source: play vars 27712 1727096506.08975: variable 'omit' from source: magic vars 27712 1727096506.08978: variable '__lsr_ansible_managed' from source: task vars 27712 1727096506.08980: variable '__lsr_ansible_managed' from source: task vars 27712 1727096506.09383: Loaded config def from plugin (lookup/template) 27712 1727096506.09387: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 27712 1727096506.09413: File lookup term: get_ansible_managed.j2 27712 1727096506.09417: variable 'ansible_search_path' from source: unknown 27712 1727096506.09422: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 27712 1727096506.09436: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 27712 1727096506.09451: variable 'ansible_search_path' from source: unknown 27712 1727096506.15604: variable 'ansible_managed' from source: 
unknown 27712 1727096506.15744: variable 'omit' from source: magic vars 27712 1727096506.15781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096506.15814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096506.15843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096506.15865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096506.15885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096506.15914: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096506.15922: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096506.15929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096506.16031: Set connection var ansible_connection to ssh 27712 1727096506.16044: Set connection var ansible_pipelining to False 27712 1727096506.16060: Set connection var ansible_timeout to 10 27712 1727096506.16066: Set connection var ansible_shell_type to sh 27712 1727096506.16083: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096506.16092: Set connection var ansible_shell_executable to /bin/sh 27712 1727096506.16119: variable 'ansible_shell_executable' from source: unknown 27712 1727096506.16128: variable 'ansible_connection' from source: unknown 27712 1727096506.16134: variable 'ansible_module_compression' from source: unknown 27712 1727096506.16140: variable 'ansible_shell_type' from source: unknown 27712 1727096506.16146: variable 'ansible_shell_executable' from source: unknown 27712 1727096506.16152: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096506.16166: variable 'ansible_pipelining' from source: unknown 27712 1727096506.16178: variable 'ansible_timeout' from source: unknown 27712 1727096506.16186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096506.16323: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096506.16345: variable 'omit' from source: magic vars 27712 1727096506.16355: starting attempt loop 27712 1727096506.16362: running the handler 27712 1727096506.16392: _low_level_execute_command(): starting 27712 1727096506.16403: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096506.17111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096506.17157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096506.17241: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 27712 1727096506.17263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096506.17315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096506.17343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096506.19079: stdout chunk (state=3): >>>/root <<< 27712 1727096506.19187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096506.19191: stderr chunk (state=3): >>><<< 27712 1727096506.19193: stdout chunk (state=3): >>><<< 27712 1727096506.19337: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096506.19341: _low_level_execute_command(): starting 27712 1727096506.19343: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446 `" && echo ansible-tmp-1727096506.1922162-29183-67968565654446="` echo /root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446 `" ) && sleep 0' 27712 1727096506.20182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096506.20185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096506.20188: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096506.20190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 27712 1727096506.20192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096506.20194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096506.20235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096506.20291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096506.20310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096506.22209: stdout chunk (state=3): >>>ansible-tmp-1727096506.1922162-29183-67968565654446=/root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446 <<< 27712 1727096506.22476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096506.22480: stdout chunk (state=3): >>><<< 27712 1727096506.22482: stderr chunk (state=3): >>><<< 27712 1727096506.22485: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096506.1922162-29183-67968565654446=/root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096506.22488: variable 'ansible_module_compression' from source: unknown 27712 1727096506.22512: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 27712 1727096506.22569: variable 'ansible_facts' from source: unknown 27712 1727096506.22730: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/AnsiballZ_network_connections.py 27712 1727096506.22958: Sending initial data 27712 1727096506.22961: Sent initial data (167 bytes) 27712 1727096506.23748: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096506.23751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096506.23754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096506.23756: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096506.23762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096506.23764: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096506.23769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096506.23775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096506.23820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096506.25390: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 27712 1727096506.25404: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 27712 1727096506.25424: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 27712 1727096506.25452: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096506.25496: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096506.25546: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpbxvldyu7 /root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/AnsiballZ_network_connections.py <<< 27712 1727096506.25549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/AnsiballZ_network_connections.py" <<< 27712 1727096506.25587: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpbxvldyu7" to remote "/root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/AnsiballZ_network_connections.py" <<< 27712 1727096506.26697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096506.26709: stderr chunk (state=3): >>><<< 27712 1727096506.26718: stdout chunk (state=3): >>><<< 27712 1727096506.26781: done transferring module to remote 27712 1727096506.26784: _low_level_execute_command(): starting 27712 1727096506.26787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/ /root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/AnsiballZ_network_connections.py && sleep 0' 27712 1727096506.27575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096506.27591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096506.27607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096506.27625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096506.27645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096506.27750: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096506.27782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096506.27844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096506.29652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096506.29658: stdout chunk (state=3): >>><<< 27712 1727096506.29665: stderr chunk (state=3): >>><<< 27712 1727096506.29685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096506.29707: _low_level_execute_command(): starting 27712 1727096506.29710: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/AnsiballZ_network_connections.py && sleep 0' 27712 1727096506.30283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096506.30319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096506.30339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096506.30361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096506.30431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096506.71494: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu9mtyt4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu9mtyt4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/6432d2d2-f377-4efe-a5e3-d4d7172e455e: error=unknown <<< 27712 1727096506.73098: stdout chunk (state=3): >>>Traceback (most recent 
call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu9mtyt4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu9mtyt4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/82850e40-6086-456d-9524-f262534a63fa: error=unknown <<< 27712 1727096506.73208: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 27712 1727096506.75344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096506.75347: stdout chunk (state=3): >>><<< 27712 1727096506.75349: stderr chunk (state=3): >>><<< 27712 1727096506.75481: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu9mtyt4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu9mtyt4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/6432d2d2-f377-4efe-a5e3-d4d7172e455e: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu9mtyt4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu9mtyt4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/82850e40-6086-456d-9524-f262534a63fa: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": 
{"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
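
The module_args echoed above are enough to reconstruct the intent of this step. The play sketch below is assembled from those arguments and is not the test playbook itself; the include_role framing and the explicit provider pin are assumptions (the run derives "nm" via set_fact), but the two connection entries are exactly what the fedora.linux_system_roles.network_connections module received. Note that the NM "Connection volatilize aborted ... error=unknown" tracebacks land in stdout while the module still returns changed: true, so the run continues.

    - hosts: managed_node2
      tasks:
        - name: Remove the ethtest0/ethtest1 profiles with the network role   # reproduces the module_args above
          include_role:
            name: fedora.linux_system_roles.network
          vars:
            network_provider: nm              # assumed explicit pin; this run sets it via set_fact
            network_connections:
              - name: ethtest0
                persistent_state: absent
                state: down
              - name: ethtest1
                persistent_state: absent
                state: down
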
27712 1727096506.75485: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'ethtest1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096506.75487: _low_level_execute_command(): starting 27712 1727096506.75489: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096506.1922162-29183-67968565654446/ > /dev/null 2>&1 && sleep 0' 27712 1727096506.76285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096506.76339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096506.76366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096506.78212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096506.78243: stderr chunk (state=3): >>><<< 27712 1727096506.78253: stdout chunk (state=3): >>><<< 27712 1727096506.78673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096506.78677: handler run complete 27712 1727096506.78679: attempt loop complete, returning result 27712 1727096506.78681: _execute() done 27712 1727096506.78683: dumping result to json 27712 1727096506.78685: done dumping result, returning 27712 1727096506.78687: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-cbc7-8716-000000000651] 27712 1727096506.78689: sending task result for task 0afff68d-5257-cbc7-8716-000000000651 27712 1727096506.78776: done sending task result for task 0afff68d-5257-cbc7-8716-000000000651 27712 1727096506.78780: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 27712 1727096506.78891: no more pending results, returning what we have 27712 1727096506.78895: results queue empty 27712 1727096506.78896: checking for any_errors_fatal 27712 1727096506.78903: done checking for any_errors_fatal 27712 1727096506.78904: checking for max_fail_percentage 27712 1727096506.78906: done checking for max_fail_percentage 27712 1727096506.78907: checking to see if all hosts have failed and the running result is not ok 27712 1727096506.78907: done checking to see if all hosts have failed 27712 1727096506.78908: getting the remaining hosts for this loop 27712 1727096506.78910: done getting the remaining hosts for this loop 27712 1727096506.78913: getting the next task for host managed_node2 27712 1727096506.78921: done getting next task for host managed_node2 27712 1727096506.78924: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 27712 1727096506.78929: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096506.78940: getting variables 27712 1727096506.78941: in VariableManager get_vars() 27712 1727096506.79389: Calling all_inventory to load vars for managed_node2 27712 1727096506.79392: Calling groups_inventory to load vars for managed_node2 27712 1727096506.79395: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096506.79405: Calling all_plugins_play to load vars for managed_node2 27712 1727096506.79408: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096506.79411: Calling groups_plugins_play to load vars for managed_node2 27712 1727096506.82872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096506.85163: done with get_vars() 27712 1727096506.85196: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 09:01:46 -0400 (0:00:00.812) 0:00:32.546 ****** 27712 1727096506.85304: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27712 1727096506.85702: worker is 1 (out of 1 available) 27712 1727096506.85713: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 27712 1727096506.85841: done queuing things up, now waiting for results queue to drain 27712 1727096506.85843: waiting for pending results... 27712 1727096506.86040: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 27712 1727096506.86285: in run() - task 0afff68d-5257-cbc7-8716-000000000652 27712 1727096506.86289: variable 'ansible_search_path' from source: unknown 27712 1727096506.86292: variable 'ansible_search_path' from source: unknown 27712 1727096506.86295: calling self._execute() 27712 1727096506.86364: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096506.86378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096506.86401: variable 'omit' from source: magic vars 27712 1727096506.86783: variable 'ansible_distribution_major_version' from source: facts 27712 1727096506.86801: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096506.86938: variable 'network_state' from source: role '' defaults 27712 1727096506.86955: Evaluated conditional (network_state != {}): False 27712 1727096506.86963: when evaluation is False, skipping this task 27712 1727096506.86973: _execute() done 27712 1727096506.86986: dumping result to json 27712 1727096506.87017: done dumping result, returning 27712 1727096506.87029: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-cbc7-8716-000000000652] 27712 1727096506.87154: sending task result for task 0afff68d-5257-cbc7-8716-000000000652 27712 1727096506.87226: done sending task result for task 0afff68d-5257-cbc7-8716-000000000652 27712 1727096506.87230: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 27712 1727096506.87315: no more pending results, returning what we have 27712 1727096506.87319: results queue empty 27712 1727096506.87320: checking for any_errors_fatal 27712 1727096506.87335: done checking for any_errors_fatal 27712 1727096506.87336: checking for max_fail_percentage 27712 
1727096506.87338: done checking for max_fail_percentage 27712 1727096506.87339: checking to see if all hosts have failed and the running result is not ok 27712 1727096506.87340: done checking to see if all hosts have failed 27712 1727096506.87341: getting the remaining hosts for this loop 27712 1727096506.87342: done getting the remaining hosts for this loop 27712 1727096506.87346: getting the next task for host managed_node2 27712 1727096506.87353: done getting next task for host managed_node2 27712 1727096506.87357: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27712 1727096506.87362: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096506.87387: getting variables 27712 1727096506.87389: in VariableManager get_vars() 27712 1727096506.87432: Calling all_inventory to load vars for managed_node2 27712 1727096506.87435: Calling groups_inventory to load vars for managed_node2 27712 1727096506.87438: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096506.87451: Calling all_plugins_play to load vars for managed_node2 27712 1727096506.87456: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096506.87459: Calling groups_plugins_play to load vars for managed_node2 27712 1727096506.89270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096506.90871: done with get_vars() 27712 1727096506.90893: done getting variables 27712 1727096506.90953: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 09:01:46 -0400 (0:00:00.056) 0:00:32.603 ****** 27712 1727096506.91002: entering _queue_task() for managed_node2/debug 27712 1727096506.91398: worker is 1 (out of 1 available) 27712 1727096506.91411: exiting _queue_task() for managed_node2/debug 27712 1727096506.91422: done queuing things up, now waiting for results queue to drain 27712 1727096506.91423: waiting for pending results... 
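
The task queued next is the role's stderr reporting step at main.yml:177; its effect is visible just below, where __network_connections_result.stderr_lines is printed and contains only an empty string. A minimal sketch of a debug task of that shape, assuming that is all it does:

    - name: Show stderr messages for the network_connections        # shape of the task at main.yml:177
      debug:
        var: __network_connections_result.stderr_lines              # registered by the earlier network_connections call
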
27712 1727096506.91680: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 27712 1727096506.91929: in run() - task 0afff68d-5257-cbc7-8716-000000000653 27712 1727096506.91932: variable 'ansible_search_path' from source: unknown 27712 1727096506.91935: variable 'ansible_search_path' from source: unknown 27712 1727096506.91938: calling self._execute() 27712 1727096506.91998: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096506.92013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096506.92034: variable 'omit' from source: magic vars 27712 1727096506.92435: variable 'ansible_distribution_major_version' from source: facts 27712 1727096506.92453: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096506.92464: variable 'omit' from source: magic vars 27712 1727096506.92534: variable 'omit' from source: magic vars 27712 1727096506.92572: variable 'omit' from source: magic vars 27712 1727096506.92625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096506.92924: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096506.92928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096506.92930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096506.92933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096506.92947: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096506.93033: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096506.93037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096506.93191: Set connection var ansible_connection to ssh 27712 1727096506.93209: Set connection var ansible_pipelining to False 27712 1727096506.93220: Set connection var ansible_timeout to 10 27712 1727096506.93227: Set connection var ansible_shell_type to sh 27712 1727096506.93240: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096506.93251: Set connection var ansible_shell_executable to /bin/sh 27712 1727096506.93289: variable 'ansible_shell_executable' from source: unknown 27712 1727096506.93417: variable 'ansible_connection' from source: unknown 27712 1727096506.93420: variable 'ansible_module_compression' from source: unknown 27712 1727096506.93423: variable 'ansible_shell_type' from source: unknown 27712 1727096506.93425: variable 'ansible_shell_executable' from source: unknown 27712 1727096506.93426: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096506.93428: variable 'ansible_pipelining' from source: unknown 27712 1727096506.93430: variable 'ansible_timeout' from source: unknown 27712 1727096506.93431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096506.93743: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 
1727096506.93747: variable 'omit' from source: magic vars 27712 1727096506.93755: starting attempt loop 27712 1727096506.93761: running the handler 27712 1727096506.94030: variable '__network_connections_result' from source: set_fact 27712 1727096506.94091: handler run complete 27712 1727096506.94116: attempt loop complete, returning result 27712 1727096506.94123: _execute() done 27712 1727096506.94135: dumping result to json 27712 1727096506.94143: done dumping result, returning 27712 1727096506.94173: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-cbc7-8716-000000000653] 27712 1727096506.94191: sending task result for task 0afff68d-5257-cbc7-8716-000000000653 27712 1727096506.94318: done sending task result for task 0afff68d-5257-cbc7-8716-000000000653 27712 1727096506.94321: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 27712 1727096506.94417: no more pending results, returning what we have 27712 1727096506.94421: results queue empty 27712 1727096506.94422: checking for any_errors_fatal 27712 1727096506.94430: done checking for any_errors_fatal 27712 1727096506.94430: checking for max_fail_percentage 27712 1727096506.94432: done checking for max_fail_percentage 27712 1727096506.94433: checking to see if all hosts have failed and the running result is not ok 27712 1727096506.94434: done checking to see if all hosts have failed 27712 1727096506.94435: getting the remaining hosts for this loop 27712 1727096506.94436: done getting the remaining hosts for this loop 27712 1727096506.94441: getting the next task for host managed_node2 27712 1727096506.94448: done getting next task for host managed_node2 27712 1727096506.94452: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27712 1727096506.94457: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096506.94471: getting variables 27712 1727096506.94472: in VariableManager get_vars() 27712 1727096506.94515: Calling all_inventory to load vars for managed_node2 27712 1727096506.94519: Calling groups_inventory to load vars for managed_node2 27712 1727096506.94521: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096506.94532: Calling all_plugins_play to load vars for managed_node2 27712 1727096506.94535: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096506.94539: Calling groups_plugins_play to load vars for managed_node2 27712 1727096506.96311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096506.98537: done with get_vars() 27712 1727096506.98561: done getting variables 27712 1727096506.98626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 09:01:46 -0400 (0:00:00.076) 0:00:32.679 ****** 27712 1727096506.98661: entering _queue_task() for managed_node2/debug 27712 1727096506.99105: worker is 1 (out of 1 available) 27712 1727096506.99117: exiting _queue_task() for managed_node2/debug 27712 1727096506.99129: done queuing things up, now waiting for results queue to drain 27712 1727096506.99130: waiting for pending results... 27712 1727096506.99715: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 27712 1727096506.99806: in run() - task 0afff68d-5257-cbc7-8716-000000000654 27712 1727096506.99843: variable 'ansible_search_path' from source: unknown 27712 1727096506.99846: variable 'ansible_search_path' from source: unknown 27712 1727096506.99854: calling self._execute() 27712 1727096507.00024: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.00028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.00032: variable 'omit' from source: magic vars 27712 1727096507.00321: variable 'ansible_distribution_major_version' from source: facts 27712 1727096507.00333: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096507.00338: variable 'omit' from source: magic vars 27712 1727096507.00399: variable 'omit' from source: magic vars 27712 1727096507.00433: variable 'omit' from source: magic vars 27712 1727096507.00474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096507.00511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096507.00529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096507.00546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096507.00570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096507.00591: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096507.00675: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.00679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.00772: Set connection var ansible_connection to ssh 27712 1727096507.00781: Set connection var ansible_pipelining to False 27712 1727096507.00784: Set connection var ansible_timeout to 10 27712 1727096507.00786: Set connection var ansible_shell_type to sh 27712 1727096507.00790: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096507.00792: Set connection var ansible_shell_executable to /bin/sh 27712 1727096507.00794: variable 'ansible_shell_executable' from source: unknown 27712 1727096507.00797: variable 'ansible_connection' from source: unknown 27712 1727096507.00799: variable 'ansible_module_compression' from source: unknown 27712 1727096507.00801: variable 'ansible_shell_type' from source: unknown 27712 1727096507.00803: variable 'ansible_shell_executable' from source: unknown 27712 1727096507.00805: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.00807: variable 'ansible_pipelining' from source: unknown 27712 1727096507.00809: variable 'ansible_timeout' from source: unknown 27712 1727096507.00811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.01002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096507.01006: variable 'omit' from source: magic vars 27712 1727096507.01008: starting attempt loop 27712 1727096507.01011: running the handler 27712 1727096507.01013: variable '__network_connections_result' from source: set_fact 27712 1727096507.01046: variable '__network_connections_result' from source: set_fact 27712 1727096507.01163: handler run complete 27712 1727096507.01194: attempt loop complete, returning result 27712 1727096507.01197: _execute() done 27712 1727096507.01200: dumping result to json 27712 1727096507.01202: done dumping result, returning 27712 1727096507.01216: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-cbc7-8716-000000000654] 27712 1727096507.01219: sending task result for task 0afff68d-5257-cbc7-8716-000000000654 27712 1727096507.01436: done sending task result for task 0afff68d-5257-cbc7-8716-000000000654 27712 1727096507.01439: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 27712 1727096507.01522: no more pending results, returning what we have 27712 1727096507.01525: results queue empty 27712 1727096507.01526: checking for any_errors_fatal 27712 1727096507.01531: done checking for any_errors_fatal 27712 1727096507.01531: checking for max_fail_percentage 27712 
1727096507.01533: done checking for max_fail_percentage 27712 1727096507.01533: checking to see if all hosts have failed and the running result is not ok 27712 1727096507.01534: done checking to see if all hosts have failed 27712 1727096507.01535: getting the remaining hosts for this loop 27712 1727096507.01536: done getting the remaining hosts for this loop 27712 1727096507.01607: getting the next task for host managed_node2 27712 1727096507.01614: done getting next task for host managed_node2 27712 1727096507.01617: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27712 1727096507.01621: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096507.01633: getting variables 27712 1727096507.01634: in VariableManager get_vars() 27712 1727096507.01686: Calling all_inventory to load vars for managed_node2 27712 1727096507.01689: Calling groups_inventory to load vars for managed_node2 27712 1727096507.01691: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096507.01699: Calling all_plugins_play to load vars for managed_node2 27712 1727096507.01704: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096507.01707: Calling groups_plugins_play to load vars for managed_node2 27712 1727096507.03717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096507.05457: done with get_vars() 27712 1727096507.05482: done getting variables 27712 1727096507.05547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 09:01:47 -0400 (0:00:00.069) 0:00:32.749 ****** 27712 1727096507.05583: entering _queue_task() for managed_node2/debug 27712 1727096507.05944: worker is 1 (out of 1 available) 27712 1727096507.05957: exiting _queue_task() for managed_node2/debug 27712 1727096507.05975: done queuing things up, now waiting for results queue to drain 27712 1727096507.05977: waiting for pending results... 
27712 1727096507.06295: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 27712 1727096507.06381: in run() - task 0afff68d-5257-cbc7-8716-000000000655 27712 1727096507.06395: variable 'ansible_search_path' from source: unknown 27712 1727096507.06399: variable 'ansible_search_path' from source: unknown 27712 1727096507.06502: calling self._execute() 27712 1727096507.06528: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.06534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.06543: variable 'omit' from source: magic vars 27712 1727096507.06914: variable 'ansible_distribution_major_version' from source: facts 27712 1727096507.06927: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096507.07046: variable 'network_state' from source: role '' defaults 27712 1727096507.07061: Evaluated conditional (network_state != {}): False 27712 1727096507.07065: when evaluation is False, skipping this task 27712 1727096507.07069: _execute() done 27712 1727096507.07071: dumping result to json 27712 1727096507.07074: done dumping result, returning 27712 1727096507.07082: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-cbc7-8716-000000000655] 27712 1727096507.07087: sending task result for task 0afff68d-5257-cbc7-8716-000000000655 27712 1727096507.07387: done sending task result for task 0afff68d-5257-cbc7-8716-000000000655 27712 1727096507.07391: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 27712 1727096507.07425: no more pending results, returning what we have 27712 1727096507.07428: results queue empty 27712 1727096507.07429: checking for any_errors_fatal 27712 1727096507.07435: done checking for any_errors_fatal 27712 1727096507.07436: checking for max_fail_percentage 27712 1727096507.07438: done checking for max_fail_percentage 27712 1727096507.07438: checking to see if all hosts have failed and the running result is not ok 27712 1727096507.07439: done checking to see if all hosts have failed 27712 1727096507.07440: getting the remaining hosts for this loop 27712 1727096507.07441: done getting the remaining hosts for this loop 27712 1727096507.07444: getting the next task for host managed_node2 27712 1727096507.07451: done getting next task for host managed_node2 27712 1727096507.07454: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 27712 1727096507.07459: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096507.07478: getting variables 27712 1727096507.07480: in VariableManager get_vars() 27712 1727096507.07516: Calling all_inventory to load vars for managed_node2 27712 1727096507.07519: Calling groups_inventory to load vars for managed_node2 27712 1727096507.07522: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096507.07530: Calling all_plugins_play to load vars for managed_node2 27712 1727096507.07533: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096507.07536: Calling groups_plugins_play to load vars for managed_node2 27712 1727096507.08844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096507.10452: done with get_vars() 27712 1727096507.10483: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 09:01:47 -0400 (0:00:00.050) 0:00:32.799 ****** 27712 1727096507.10587: entering _queue_task() for managed_node2/ping 27712 1727096507.11066: worker is 1 (out of 1 available) 27712 1727096507.11077: exiting _queue_task() for managed_node2/ping 27712 1727096507.11088: done queuing things up, now waiting for results queue to drain 27712 1727096507.11089: waiting for pending results... 27712 1727096507.11289: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 27712 1727096507.11493: in run() - task 0afff68d-5257-cbc7-8716-000000000656 27712 1727096507.11498: variable 'ansible_search_path' from source: unknown 27712 1727096507.11500: variable 'ansible_search_path' from source: unknown 27712 1727096507.11513: calling self._execute() 27712 1727096507.11621: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.11707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.11712: variable 'omit' from source: magic vars 27712 1727096507.12061: variable 'ansible_distribution_major_version' from source: facts 27712 1727096507.12082: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096507.12095: variable 'omit' from source: magic vars 27712 1727096507.12158: variable 'omit' from source: magic vars 27712 1727096507.12197: variable 'omit' from source: magic vars 27712 1727096507.12241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096507.12291: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096507.12315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096507.12337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096507.12366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096507.12476: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096507.12479: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.12481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.12519: Set connection var ansible_connection to ssh 27712 1727096507.12534: Set connection var 
ansible_pipelining to False 27712 1727096507.12545: Set connection var ansible_timeout to 10 27712 1727096507.12552: Set connection var ansible_shell_type to sh 27712 1727096507.12565: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096507.12584: Set connection var ansible_shell_executable to /bin/sh 27712 1727096507.12611: variable 'ansible_shell_executable' from source: unknown 27712 1727096507.12620: variable 'ansible_connection' from source: unknown 27712 1727096507.12628: variable 'ansible_module_compression' from source: unknown 27712 1727096507.12636: variable 'ansible_shell_type' from source: unknown 27712 1727096507.12643: variable 'ansible_shell_executable' from source: unknown 27712 1727096507.12652: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.12660: variable 'ansible_pipelining' from source: unknown 27712 1727096507.12670: variable 'ansible_timeout' from source: unknown 27712 1727096507.12680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.12911: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096507.12916: variable 'omit' from source: magic vars 27712 1727096507.12928: starting attempt loop 27712 1727096507.12972: running the handler 27712 1727096507.12976: _low_level_execute_command(): starting 27712 1727096507.12978: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096507.13790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.13808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.13834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.13881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.13929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.15619: stdout chunk (state=3): >>>/root <<< 27712 1727096507.15782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.15786: stdout chunk (state=3): >>><<< 27712 1727096507.15788: stderr chunk (state=3): >>><<< 27712 1727096507.15809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096507.15829: _low_level_execute_command(): starting 27712 1727096507.15840: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064 `" && echo ansible-tmp-1727096507.1581628-29251-15368174823064="` echo /root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064 `" ) && sleep 0' 27712 1727096507.16439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096507.16454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096507.16472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096507.16490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096507.16514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096507.16530: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096507.16629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.16685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.16723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.18697: stdout chunk (state=3): >>>ansible-tmp-1727096507.1581628-29251-15368174823064=/root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064 <<< 27712 1727096507.18856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.18860: stdout chunk (state=3): >>><<< 27712 1727096507.18863: stderr chunk (state=3): >>><<< 27712 1727096507.18887: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1727096507.1581628-29251-15368174823064=/root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096507.18944: variable 'ansible_module_compression' from source: unknown 27712 1727096507.19073: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 27712 1727096507.19076: variable 'ansible_facts' from source: unknown 27712 1727096507.19141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/AnsiballZ_ping.py 27712 1727096507.19292: Sending initial data 27712 1727096507.19414: Sent initial data (152 bytes) 27712 1727096507.19969: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096507.20082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.20099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.20119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.20194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.21804: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096507.21863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096507.21914: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpcegnixi0 /root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/AnsiballZ_ping.py <<< 27712 1727096507.21917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/AnsiballZ_ping.py" <<< 27712 1727096507.21960: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpcegnixi0" to remote "/root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/AnsiballZ_ping.py" <<< 27712 1727096507.22735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.22739: stderr chunk (state=3): >>><<< 27712 1727096507.22741: stdout chunk (state=3): >>><<< 27712 1727096507.22752: done transferring module to remote 27712 1727096507.22769: _low_level_execute_command(): starting 27712 1727096507.22782: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/ /root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/AnsiballZ_ping.py && sleep 0' 27712 1727096507.23435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096507.23448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096507.23462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096507.23487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096507.23533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.23613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.23631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.23670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 
1727096507.23706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.25622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.25626: stdout chunk (state=3): >>><<< 27712 1727096507.25628: stderr chunk (state=3): >>><<< 27712 1727096507.25651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096507.25675: _low_level_execute_command(): starting 27712 1727096507.25678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/AnsiballZ_ping.py && sleep 0' 27712 1727096507.26326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096507.26342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096507.26358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096507.26380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096507.26399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096507.26411: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096507.26437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.26486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.26544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.26575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.26656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 27712 1727096507.42214: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 27712 1727096507.43695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096507.43699: stdout chunk (state=3): >>><<< 27712 1727096507.43701: stderr chunk (state=3): >>><<< 27712 1727096507.43776: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096507.43780: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096507.43783: _low_level_execute_command(): starting 27712 1727096507.43785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096507.1581628-29251-15368174823064/ > /dev/null 2>&1 && sleep 0' 27712 1727096507.44422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096507.44437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096507.44450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096507.44474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096507.44494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096507.44507: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096507.44527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.44585: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.44636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.44659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.44684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.44754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.46632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.46677: stderr chunk (state=3): >>><<< 27712 1727096507.46681: stdout chunk (state=3): >>><<< 27712 1727096507.46692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096507.46702: handler run complete 27712 1727096507.46712: attempt loop complete, returning result 27712 1727096507.46715: _execute() done 27712 1727096507.46718: dumping result to json 27712 1727096507.46720: done dumping result, returning 27712 1727096507.46728: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-cbc7-8716-000000000656] 27712 1727096507.46732: sending task result for task 0afff68d-5257-cbc7-8716-000000000656 27712 1727096507.46827: done sending task result for task 0afff68d-5257-cbc7-8716-000000000656 27712 1727096507.46830: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 27712 1727096507.46913: no more pending results, returning what we have 27712 1727096507.46917: results queue empty 27712 1727096507.46918: checking for any_errors_fatal 27712 1727096507.46923: done checking for any_errors_fatal 27712 1727096507.46924: checking for max_fail_percentage 27712 1727096507.46926: done checking for max_fail_percentage 27712 1727096507.46927: checking to see if all hosts have failed and the running 
result is not ok 27712 1727096507.46928: done checking to see if all hosts have failed 27712 1727096507.46928: getting the remaining hosts for this loop 27712 1727096507.46930: done getting the remaining hosts for this loop 27712 1727096507.46933: getting the next task for host managed_node2 27712 1727096507.46945: done getting next task for host managed_node2 27712 1727096507.46947: ^ task is: TASK: meta (role_complete) 27712 1727096507.46951: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096507.46962: getting variables 27712 1727096507.46963: in VariableManager get_vars() 27712 1727096507.47007: Calling all_inventory to load vars for managed_node2 27712 1727096507.47009: Calling groups_inventory to load vars for managed_node2 27712 1727096507.47011: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096507.47021: Calling all_plugins_play to load vars for managed_node2 27712 1727096507.47023: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096507.47026: Calling groups_plugins_play to load vars for managed_node2 27712 1727096507.47979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096507.49351: done with get_vars() 27712 1727096507.49370: done getting variables 27712 1727096507.49427: done queuing things up, now waiting for results queue to drain 27712 1727096507.49429: results queue empty 27712 1727096507.49429: checking for any_errors_fatal 27712 1727096507.49431: done checking for any_errors_fatal 27712 1727096507.49431: checking for max_fail_percentage 27712 1727096507.49432: done checking for max_fail_percentage 27712 1727096507.49432: checking to see if all hosts have failed and the running result is not ok 27712 1727096507.49433: done checking to see if all hosts have failed 27712 1727096507.49433: getting the remaining hosts for this loop 27712 1727096507.49434: done getting the remaining hosts for this loop 27712 1727096507.49436: getting the next task for host managed_node2 27712 1727096507.49439: done getting next task for host managed_node2 27712 1727096507.49440: ^ task is: TASK: Delete interface1 27712 1727096507.49442: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 27712 1727096507.49443: getting variables 27712 1727096507.49444: in VariableManager get_vars() 27712 1727096507.49461: Calling all_inventory to load vars for managed_node2 27712 1727096507.49462: Calling groups_inventory to load vars for managed_node2 27712 1727096507.49464: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096507.49469: Calling all_plugins_play to load vars for managed_node2 27712 1727096507.49470: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096507.49474: Calling groups_plugins_play to load vars for managed_node2 27712 1727096507.50102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096507.50941: done with get_vars() 27712 1727096507.50955: done getting variables TASK [Delete interface1] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:151 Monday 23 September 2024 09:01:47 -0400 (0:00:00.404) 0:00:33.203 ****** 27712 1727096507.51010: entering _queue_task() for managed_node2/include_tasks 27712 1727096507.51252: worker is 1 (out of 1 available) 27712 1727096507.51265: exiting _queue_task() for managed_node2/include_tasks 27712 1727096507.51280: done queuing things up, now waiting for results queue to drain 27712 1727096507.51281: waiting for pending results... 27712 1727096507.51588: running TaskExecutor() for managed_node2/TASK: Delete interface1 27712 1727096507.51602: in run() - task 0afff68d-5257-cbc7-8716-0000000000b5 27712 1727096507.51622: variable 'ansible_search_path' from source: unknown 27712 1727096507.51664: calling self._execute() 27712 1727096507.51766: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.51785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.51799: variable 'omit' from source: magic vars 27712 1727096507.52185: variable 'ansible_distribution_major_version' from source: facts 27712 1727096507.52375: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096507.52379: _execute() done 27712 1727096507.52382: dumping result to json 27712 1727096507.52384: done dumping result, returning 27712 1727096507.52387: done running TaskExecutor() for managed_node2/TASK: Delete interface1 [0afff68d-5257-cbc7-8716-0000000000b5] 27712 1727096507.52389: sending task result for task 0afff68d-5257-cbc7-8716-0000000000b5 27712 1727096507.52483: no more pending results, returning what we have 27712 1727096507.52487: in VariableManager get_vars() 27712 1727096507.52530: Calling all_inventory to load vars for managed_node2 27712 1727096507.52532: Calling groups_inventory to load vars for managed_node2 27712 1727096507.52534: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096507.52546: Calling all_plugins_play to load vars for managed_node2 27712 1727096507.52549: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096507.52552: Calling groups_plugins_play to load vars for managed_node2 27712 1727096507.53077: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000b5 27712 1727096507.53081: WORKER PROCESS EXITING 27712 1727096507.53997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096507.55560: done with get_vars() 27712 
1727096507.55584: variable 'ansible_search_path' from source: unknown 27712 1727096507.55598: we have included files to process 27712 1727096507.55599: generating all_blocks data 27712 1727096507.55600: done generating all_blocks data 27712 1727096507.55606: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27712 1727096507.55607: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27712 1727096507.55609: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27712 1727096507.55827: done processing included file 27712 1727096507.55829: iterating over new_blocks loaded from include file 27712 1727096507.55831: in VariableManager get_vars() 27712 1727096507.55849: done with get_vars() 27712 1727096507.55852: filtering new block on tags 27712 1727096507.55882: done filtering new block on tags 27712 1727096507.55884: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 27712 1727096507.55890: extending task lists for all hosts with included blocks 27712 1727096507.57136: done extending task lists 27712 1727096507.57138: done processing included files 27712 1727096507.57139: results queue empty 27712 1727096507.57139: checking for any_errors_fatal 27712 1727096507.57141: done checking for any_errors_fatal 27712 1727096507.57142: checking for max_fail_percentage 27712 1727096507.57143: done checking for max_fail_percentage 27712 1727096507.57143: checking to see if all hosts have failed and the running result is not ok 27712 1727096507.57144: done checking to see if all hosts have failed 27712 1727096507.57145: getting the remaining hosts for this loop 27712 1727096507.57146: done getting the remaining hosts for this loop 27712 1727096507.57148: getting the next task for host managed_node2 27712 1727096507.57152: done getting next task for host managed_node2 27712 1727096507.57154: ^ task is: TASK: Remove test interface if necessary 27712 1727096507.57157: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096507.57160: getting variables 27712 1727096507.57160: in VariableManager get_vars() 27712 1727096507.57178: Calling all_inventory to load vars for managed_node2 27712 1727096507.57181: Calling groups_inventory to load vars for managed_node2 27712 1727096507.57183: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096507.57188: Calling all_plugins_play to load vars for managed_node2 27712 1727096507.57191: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096507.57194: Calling groups_plugins_play to load vars for managed_node2 27712 1727096507.58391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096507.59997: done with get_vars() 27712 1727096507.60017: done getting variables 27712 1727096507.60058: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Monday 23 September 2024 09:01:47 -0400 (0:00:00.090) 0:00:33.294 ****** 27712 1727096507.60092: entering _queue_task() for managed_node2/command 27712 1727096507.60434: worker is 1 (out of 1 available) 27712 1727096507.60445: exiting _queue_task() for managed_node2/command 27712 1727096507.60457: done queuing things up, now waiting for results queue to drain 27712 1727096507.60458: waiting for pending results... 27712 1727096507.60789: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 27712 1727096507.60870: in run() - task 0afff68d-5257-cbc7-8716-000000000777 27712 1727096507.60887: variable 'ansible_search_path' from source: unknown 27712 1727096507.60890: variable 'ansible_search_path' from source: unknown 27712 1727096507.60972: calling self._execute() 27712 1727096507.61026: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.61032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.61042: variable 'omit' from source: magic vars 27712 1727096507.61424: variable 'ansible_distribution_major_version' from source: facts 27712 1727096507.61478: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096507.61482: variable 'omit' from source: magic vars 27712 1727096507.61495: variable 'omit' from source: magic vars 27712 1727096507.61590: variable 'interface' from source: set_fact 27712 1727096507.61608: variable 'omit' from source: magic vars 27712 1727096507.61646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096507.61730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096507.61734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096507.61736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096507.61743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 
1727096507.61776: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096507.61780: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.61782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.61886: Set connection var ansible_connection to ssh 27712 1727096507.61894: Set connection var ansible_pipelining to False 27712 1727096507.61950: Set connection var ansible_timeout to 10 27712 1727096507.61954: Set connection var ansible_shell_type to sh 27712 1727096507.61956: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096507.61958: Set connection var ansible_shell_executable to /bin/sh 27712 1727096507.61961: variable 'ansible_shell_executable' from source: unknown 27712 1727096507.61963: variable 'ansible_connection' from source: unknown 27712 1727096507.61966: variable 'ansible_module_compression' from source: unknown 27712 1727096507.61970: variable 'ansible_shell_type' from source: unknown 27712 1727096507.61973: variable 'ansible_shell_executable' from source: unknown 27712 1727096507.61975: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096507.61977: variable 'ansible_pipelining' from source: unknown 27712 1727096507.61979: variable 'ansible_timeout' from source: unknown 27712 1727096507.61981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096507.62136: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096507.62170: variable 'omit' from source: magic vars 27712 1727096507.62173: starting attempt loop 27712 1727096507.62176: running the handler 27712 1727096507.62178: _low_level_execute_command(): starting 27712 1727096507.62181: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096507.63042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.63125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.63146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.63261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.63295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.64971: stdout chunk (state=3): 
>>>/root <<< 27712 1727096507.65086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.65130: stderr chunk (state=3): >>><<< 27712 1727096507.65148: stdout chunk (state=3): >>><<< 27712 1727096507.65170: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096507.65222: _low_level_execute_command(): starting 27712 1727096507.65226: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778 `" && echo ansible-tmp-1727096507.6517818-29264-261919948809778="` echo /root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778 `" ) && sleep 0' 27712 1727096507.65799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096507.65889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.65921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.65934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.65952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.66020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.67909: stdout chunk (state=3): >>>ansible-tmp-1727096507.6517818-29264-261919948809778=/root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778 
<<< 27712 1727096507.68013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.68075: stderr chunk (state=3): >>><<< 27712 1727096507.68085: stdout chunk (state=3): >>><<< 27712 1727096507.68107: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096507.6517818-29264-261919948809778=/root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096507.68273: variable 'ansible_module_compression' from source: unknown 27712 1727096507.68276: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096507.68278: variable 'ansible_facts' from source: unknown 27712 1727096507.68330: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/AnsiballZ_command.py 27712 1727096507.68490: Sending initial data 27712 1727096507.68506: Sent initial data (156 bytes) 27712 1727096507.69105: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096507.69174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.69226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.69248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.69281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.69332: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 27712 1727096507.70897: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096507.70956: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096507.70996: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp1rgkig80 /root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/AnsiballZ_command.py <<< 27712 1727096507.71024: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/AnsiballZ_command.py" <<< 27712 1727096507.71028: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 27712 1727096507.71059: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp1rgkig80" to remote "/root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/AnsiballZ_command.py" <<< 27712 1727096507.71700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.71824: stderr chunk (state=3): >>><<< 27712 1727096507.71833: stdout chunk (state=3): >>><<< 27712 1727096507.71846: done transferring module to remote 27712 1727096507.71862: _low_level_execute_command(): starting 27712 1727096507.71874: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/ /root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/AnsiballZ_command.py && sleep 0' 27712 1727096507.72651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096507.72710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096507.72723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096507.72816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.72838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.72898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.74731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.74739: stdout chunk (state=3): >>><<< 27712 1727096507.74923: stderr chunk (state=3): >>><<< 27712 1727096507.74932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096507.74934: _low_level_execute_command(): starting 27712 1727096507.74937: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/AnsiballZ_command.py && sleep 0' 27712 1727096507.75982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.76038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.76113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.76163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.76208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.92822: stdout chunk (state=3): 
>>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-23 09:01:47.913559", "end": "2024-09-23 09:01:47.926416", "delta": "0:00:00.012857", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096507.95048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096507.95090: stderr chunk (state=3): >>><<< 27712 1727096507.95098: stdout chunk (state=3): >>><<< 27712 1727096507.95128: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-23 09:01:47.913559", "end": "2024-09-23 09:01:47.926416", "delta": "0:00:00.012857", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096507.95181: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096507.95197: _low_level_execute_command(): starting 27712 1727096507.95200: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096507.6517818-29264-261919948809778/ > /dev/null 2>&1 && sleep 0' 27712 1727096507.96414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096507.96418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096507.96421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.96423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096507.96425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096507.96524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096507.96531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096507.96534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096507.96723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096507.98650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096507.98680: stderr chunk (state=3): >>><<< 27712 1727096507.98684: stdout chunk (state=3): >>><<< 27712 1727096507.98701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096507.98708: handler run complete 27712 1727096507.98724: Evaluated conditional (False): False 27712 1727096507.98732: attempt loop complete, returning result 27712 1727096507.98735: _execute() done 27712 1727096507.98737: dumping result to json 27712 1727096507.98753: done dumping result, returning 27712 1727096507.98782: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0afff68d-5257-cbc7-8716-000000000777] 27712 1727096507.98785: sending task result for task 0afff68d-5257-cbc7-8716-000000000777 ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest1" ], "delta": "0:00:00.012857", "end": "2024-09-23 09:01:47.926416", "rc": 0, "start": "2024-09-23 09:01:47.913559" } 27712 1727096507.98984: no more pending results, returning what we have 27712 1727096507.98988: results queue empty 27712 1727096507.98991: checking for any_errors_fatal 27712 1727096507.98993: done checking for any_errors_fatal 27712 1727096507.98994: checking for max_fail_percentage 27712 1727096507.98995: done checking for max_fail_percentage 27712 1727096507.98996: checking to see if all hosts have failed and the running result is not ok 27712 1727096507.99000: done checking to see if all hosts have failed 27712 1727096507.99001: getting the remaining hosts for this loop 27712 1727096507.99003: done getting the remaining hosts for this loop 27712 1727096507.99007: getting the next task for host managed_node2 27712 1727096507.99018: done getting next task for host managed_node2 27712 1727096507.99022: ^ task is: TASK: Assert interface1 is absent 27712 1727096507.99026: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096507.99033: getting variables 27712 1727096507.99035: in VariableManager get_vars() 27712 1727096507.99091: Calling all_inventory to load vars for managed_node2 27712 1727096507.99094: Calling groups_inventory to load vars for managed_node2 27712 1727096507.99096: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096507.99108: Calling all_plugins_play to load vars for managed_node2 27712 1727096507.99111: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096507.99113: Calling groups_plugins_play to load vars for managed_node2 27712 1727096508.04642: done sending task result for task 0afff68d-5257-cbc7-8716-000000000777 27712 1727096508.04646: WORKER PROCESS EXITING 27712 1727096508.06828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096508.09002: done with get_vars() 27712 1727096508.09033: done getting variables TASK [Assert interface1 is absent] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:153 Monday 23 September 2024 09:01:48 -0400 (0:00:00.490) 0:00:33.784 ****** 27712 1727096508.09116: entering _queue_task() for managed_node2/include_tasks 27712 1727096508.09848: worker is 1 (out of 1 available) 27712 1727096508.09861: exiting _queue_task() for managed_node2/include_tasks 27712 1727096508.09873: done queuing things up, now waiting for results queue to drain 27712 1727096508.09875: waiting for pending results... 27712 1727096508.10078: running TaskExecutor() for managed_node2/TASK: Assert interface1 is absent 27712 1727096508.10202: in run() - task 0afff68d-5257-cbc7-8716-0000000000b6 27712 1727096508.10218: variable 'ansible_search_path' from source: unknown 27712 1727096508.10291: calling self._execute() 27712 1727096508.10361: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.10365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.10371: variable 'omit' from source: magic vars 27712 1727096508.10830: variable 'ansible_distribution_major_version' from source: facts 27712 1727096508.10840: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096508.10845: _execute() done 27712 1727096508.10896: dumping result to json 27712 1727096508.10900: done dumping result, returning 27712 1727096508.10903: done running TaskExecutor() for managed_node2/TASK: Assert interface1 is absent [0afff68d-5257-cbc7-8716-0000000000b6] 27712 1727096508.10905: sending task result for task 0afff68d-5257-cbc7-8716-0000000000b6 27712 1727096508.10984: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000b6 27712 1727096508.10987: WORKER PROCESS EXITING 27712 1727096508.11018: no more pending results, returning what we have 27712 1727096508.11023: in VariableManager get_vars() 27712 1727096508.11077: Calling all_inventory to load vars for managed_node2 27712 1727096508.11080: Calling groups_inventory to load vars for managed_node2 27712 1727096508.11082: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096508.11098: Calling all_plugins_play to load vars for managed_node2 27712 1727096508.11102: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096508.11105: Calling groups_plugins_play to load vars for managed_node2 27712 1727096508.12658: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096508.14321: done with get_vars() 27712 1727096508.14340: variable 'ansible_search_path' from source: unknown 27712 1727096508.14355: we have included files to process 27712 1727096508.14356: generating all_blocks data 27712 1727096508.14361: done generating all_blocks data 27712 1727096508.14366: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27712 1727096508.14391: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27712 1727096508.14396: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27712 1727096508.14659: in VariableManager get_vars() 27712 1727096508.14690: done with get_vars() 27712 1727096508.14841: done processing included file 27712 1727096508.14843: iterating over new_blocks loaded from include file 27712 1727096508.14845: in VariableManager get_vars() 27712 1727096508.14864: done with get_vars() 27712 1727096508.14866: filtering new block on tags 27712 1727096508.14905: done filtering new block on tags 27712 1727096508.14907: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 27712 1727096508.14913: extending task lists for all hosts with included blocks 27712 1727096508.16524: done extending task lists 27712 1727096508.16526: done processing included files 27712 1727096508.16526: results queue empty 27712 1727096508.16527: checking for any_errors_fatal 27712 1727096508.16533: done checking for any_errors_fatal 27712 1727096508.16534: checking for max_fail_percentage 27712 1727096508.16535: done checking for max_fail_percentage 27712 1727096508.16535: checking to see if all hosts have failed and the running result is not ok 27712 1727096508.16536: done checking to see if all hosts have failed 27712 1727096508.16537: getting the remaining hosts for this loop 27712 1727096508.16538: done getting the remaining hosts for this loop 27712 1727096508.16541: getting the next task for host managed_node2 27712 1727096508.16545: done getting next task for host managed_node2 27712 1727096508.16547: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27712 1727096508.16550: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096508.16553: getting variables 27712 1727096508.16554: in VariableManager get_vars() 27712 1727096508.16570: Calling all_inventory to load vars for managed_node2 27712 1727096508.16572: Calling groups_inventory to load vars for managed_node2 27712 1727096508.16574: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096508.16580: Calling all_plugins_play to load vars for managed_node2 27712 1727096508.16583: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096508.16586: Calling groups_plugins_play to load vars for managed_node2 27712 1727096508.18072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096508.20009: done with get_vars() 27712 1727096508.20031: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Monday 23 September 2024 09:01:48 -0400 (0:00:00.109) 0:00:33.894 ****** 27712 1727096508.20115: entering _queue_task() for managed_node2/include_tasks 27712 1727096508.20484: worker is 1 (out of 1 available) 27712 1727096508.20497: exiting _queue_task() for managed_node2/include_tasks 27712 1727096508.20510: done queuing things up, now waiting for results queue to drain 27712 1727096508.20511: waiting for pending results... 27712 1727096508.21042: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 27712 1727096508.21049: in run() - task 0afff68d-5257-cbc7-8716-000000000816 27712 1727096508.21052: variable 'ansible_search_path' from source: unknown 27712 1727096508.21054: variable 'ansible_search_path' from source: unknown 27712 1727096508.21057: calling self._execute() 27712 1727096508.21133: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.21137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.21147: variable 'omit' from source: magic vars 27712 1727096508.21676: variable 'ansible_distribution_major_version' from source: facts 27712 1727096508.21687: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096508.21690: _execute() done 27712 1727096508.21692: dumping result to json 27712 1727096508.21695: done dumping result, returning 27712 1727096508.21698: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-cbc7-8716-000000000816] 27712 1727096508.21703: sending task result for task 0afff68d-5257-cbc7-8716-000000000816 27712 1727096508.21889: done sending task result for task 0afff68d-5257-cbc7-8716-000000000816 27712 1727096508.21896: WORKER PROCESS EXITING 27712 1727096508.21935: no more pending results, returning what we have 27712 1727096508.21940: in VariableManager get_vars() 27712 1727096508.21992: Calling all_inventory to load vars for managed_node2 27712 1727096508.21996: Calling groups_inventory to load vars for managed_node2 27712 1727096508.21999: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096508.22014: Calling all_plugins_play to load vars for managed_node2 27712 1727096508.22018: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096508.22021: Calling groups_plugins_play to load vars for managed_node2 27712 1727096508.23850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 27712 1727096508.25779: done with get_vars() 27712 1727096508.25814: variable 'ansible_search_path' from source: unknown 27712 1727096508.25815: variable 'ansible_search_path' from source: unknown 27712 1727096508.25853: we have included files to process 27712 1727096508.25854: generating all_blocks data 27712 1727096508.25855: done generating all_blocks data 27712 1727096508.25856: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096508.25857: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096508.25859: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096508.26052: done processing included file 27712 1727096508.26054: iterating over new_blocks loaded from include file 27712 1727096508.26056: in VariableManager get_vars() 27712 1727096508.26080: done with get_vars() 27712 1727096508.26082: filtering new block on tags 27712 1727096508.26107: done filtering new block on tags 27712 1727096508.26110: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 27712 1727096508.26115: extending task lists for all hosts with included blocks 27712 1727096508.26237: done extending task lists 27712 1727096508.26238: done processing included files 27712 1727096508.26239: results queue empty 27712 1727096508.26240: checking for any_errors_fatal 27712 1727096508.26242: done checking for any_errors_fatal 27712 1727096508.26243: checking for max_fail_percentage 27712 1727096508.26244: done checking for max_fail_percentage 27712 1727096508.26245: checking to see if all hosts have failed and the running result is not ok 27712 1727096508.26246: done checking to see if all hosts have failed 27712 1727096508.26246: getting the remaining hosts for this loop 27712 1727096508.26248: done getting the remaining hosts for this loop 27712 1727096508.26251: getting the next task for host managed_node2 27712 1727096508.26255: done getting next task for host managed_node2 27712 1727096508.26257: ^ task is: TASK: Get stat for interface {{ interface }} 27712 1727096508.26260: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 27712 1727096508.26263: getting variables 27712 1727096508.26264: in VariableManager get_vars() 27712 1727096508.26281: Calling all_inventory to load vars for managed_node2 27712 1727096508.26283: Calling groups_inventory to load vars for managed_node2 27712 1727096508.26285: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096508.26290: Calling all_plugins_play to load vars for managed_node2 27712 1727096508.26293: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096508.26296: Calling groups_plugins_play to load vars for managed_node2 27712 1727096508.27517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096508.29141: done with get_vars() 27712 1727096508.29160: done getting variables 27712 1727096508.29325: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:01:48 -0400 (0:00:00.092) 0:00:33.986 ****** 27712 1727096508.29355: entering _queue_task() for managed_node2/stat 27712 1727096508.29801: worker is 1 (out of 1 available) 27712 1727096508.29813: exiting _queue_task() for managed_node2/stat 27712 1727096508.29823: done queuing things up, now waiting for results queue to drain 27712 1727096508.29825: waiting for pending results... 27712 1727096508.30071: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest1 27712 1727096508.30165: in run() - task 0afff68d-5257-cbc7-8716-0000000008bc 27712 1727096508.30172: variable 'ansible_search_path' from source: unknown 27712 1727096508.30175: variable 'ansible_search_path' from source: unknown 27712 1727096508.30274: calling self._execute() 27712 1727096508.30311: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.30317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.30329: variable 'omit' from source: magic vars 27712 1727096508.30704: variable 'ansible_distribution_major_version' from source: facts 27712 1727096508.30717: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096508.30723: variable 'omit' from source: magic vars 27712 1727096508.30781: variable 'omit' from source: magic vars 27712 1727096508.30874: variable 'interface' from source: set_fact 27712 1727096508.30895: variable 'omit' from source: magic vars 27712 1727096508.30936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096508.30971: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096508.30997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096508.31015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096508.31030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096508.31055: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096508.31058: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.31061: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.31167: Set connection var ansible_connection to ssh 27712 1727096508.31179: Set connection var ansible_pipelining to False 27712 1727096508.31185: Set connection var ansible_timeout to 10 27712 1727096508.31188: Set connection var ansible_shell_type to sh 27712 1727096508.31202: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096508.31205: Set connection var ansible_shell_executable to /bin/sh 27712 1727096508.31228: variable 'ansible_shell_executable' from source: unknown 27712 1727096508.31232: variable 'ansible_connection' from source: unknown 27712 1727096508.31236: variable 'ansible_module_compression' from source: unknown 27712 1727096508.31351: variable 'ansible_shell_type' from source: unknown 27712 1727096508.31357: variable 'ansible_shell_executable' from source: unknown 27712 1727096508.31361: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.31365: variable 'ansible_pipelining' from source: unknown 27712 1727096508.31369: variable 'ansible_timeout' from source: unknown 27712 1727096508.31372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.31461: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096508.31471: variable 'omit' from source: magic vars 27712 1727096508.31571: starting attempt loop 27712 1727096508.31580: running the handler 27712 1727096508.31583: _low_level_execute_command(): starting 27712 1727096508.31586: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096508.32290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096508.32336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096508.32360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096508.32404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096508.32438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096508.34127: stdout chunk (state=3): >>>/root <<< 27712 1727096508.34272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096508.34295: stdout chunk (state=3): >>><<< 27712 1727096508.34298: stderr chunk (state=3): >>><<< 27712 1727096508.34316: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096508.34416: _low_level_execute_command(): starting 27712 1727096508.34420: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131 `" && echo ansible-tmp-1727096508.3433058-29292-27070113958131="` echo /root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131 `" ) && sleep 0' 27712 1727096508.34964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096508.34982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096508.34996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096508.35025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096508.35134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096508.35192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096508.35224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096508.37139: stdout chunk (state=3): >>>ansible-tmp-1727096508.3433058-29292-27070113958131=/root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131 <<< 27712 1727096508.37282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096508.37297: stdout chunk (state=3): >>><<< 27712 1727096508.37310: stderr chunk 
(state=3): >>><<< 27712 1727096508.37338: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096508.3433058-29292-27070113958131=/root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096508.37392: variable 'ansible_module_compression' from source: unknown 27712 1727096508.37464: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27712 1727096508.37508: variable 'ansible_facts' from source: unknown 27712 1727096508.37611: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/AnsiballZ_stat.py 27712 1727096508.37840: Sending initial data 27712 1727096508.37859: Sent initial data (152 bytes) 27712 1727096508.38398: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096508.38482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096508.38515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096508.38532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096508.38554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096508.38625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096508.40202: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 
debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096508.40258: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096508.40305: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpz4x572y4 /root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/AnsiballZ_stat.py <<< 27712 1727096508.40309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/AnsiballZ_stat.py" <<< 27712 1727096508.40339: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpz4x572y4" to remote "/root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/AnsiballZ_stat.py" <<< 27712 1727096508.41085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096508.41239: stderr chunk (state=3): >>><<< 27712 1727096508.41243: stdout chunk (state=3): >>><<< 27712 1727096508.41246: done transferring module to remote 27712 1727096508.41249: _low_level_execute_command(): starting 27712 1727096508.41252: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/ /root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/AnsiballZ_stat.py && sleep 0' 27712 1727096508.42030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096508.42124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096508.42149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096508.42214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096508.44018: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 27712 1727096508.44076: stderr chunk (state=3): >>><<< 27712 1727096508.44198: stdout chunk (state=3): >>><<< 27712 1727096508.44202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096508.44210: _low_level_execute_command(): starting 27712 1727096508.44213: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/AnsiballZ_stat.py && sleep 0' 27712 1727096508.44837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096508.44899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096508.44926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096508.44993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096508.60520: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27712 1727096508.61977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096508.61982: stdout chunk (state=3): >>><<< 27712 1727096508.61985: stderr chunk (state=3): >>><<< 27712 1727096508.61987: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096508.61990: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096508.61992: _low_level_execute_command(): starting 27712 1727096508.61994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096508.3433058-29292-27070113958131/ > /dev/null 2>&1 && sleep 0' 27712 1727096508.62779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096508.62786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096508.62797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096508.62812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096508.62830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096508.62841: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096508.62875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096508.62878: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 27712 1727096508.63070: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096508.63088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096508.63091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096508.63094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096508.63110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096508.65098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096508.65120: stdout chunk (state=3): >>><<< 27712 1727096508.65145: stderr chunk (state=3): >>><<< 27712 1727096508.65386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096508.65390: handler run complete 27712 1727096508.65396: attempt loop complete, returning result 27712 1727096508.65398: _execute() done 27712 1727096508.65400: dumping result to json 27712 1727096508.65403: done dumping result, returning 27712 1727096508.65405: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest1 [0afff68d-5257-cbc7-8716-0000000008bc] 27712 1727096508.65407: sending task result for task 0afff68d-5257-cbc7-8716-0000000008bc 27712 1727096508.65491: done sending task result for task 0afff68d-5257-cbc7-8716-0000000008bc 27712 1727096508.65495: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27712 1727096508.65570: no more pending results, returning what we have 27712 1727096508.65574: results queue empty 27712 1727096508.65575: checking for any_errors_fatal 27712 1727096508.65585: done checking for any_errors_fatal 27712 1727096508.65586: checking for max_fail_percentage 27712 1727096508.65588: done checking for max_fail_percentage 27712 1727096508.65589: checking to see 
if all hosts have failed and the running result is not ok 27712 1727096508.65590: done checking to see if all hosts have failed 27712 1727096508.65591: getting the remaining hosts for this loop 27712 1727096508.65593: done getting the remaining hosts for this loop 27712 1727096508.65596: getting the next task for host managed_node2 27712 1727096508.65606: done getting next task for host managed_node2 27712 1727096508.65608: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 27712 1727096508.65613: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096508.65619: getting variables 27712 1727096508.65620: in VariableManager get_vars() 27712 1727096508.65984: Calling all_inventory to load vars for managed_node2 27712 1727096508.65987: Calling groups_inventory to load vars for managed_node2 27712 1727096508.65990: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096508.66001: Calling all_plugins_play to load vars for managed_node2 27712 1727096508.66004: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096508.66007: Calling groups_plugins_play to load vars for managed_node2 27712 1727096508.69375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096508.71498: done with get_vars() 27712 1727096508.71530: done getting variables 27712 1727096508.71647: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096508.71997: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest1'] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Monday 23 September 2024 09:01:48 -0400 (0:00:00.426) 0:00:34.413 ****** 27712 1727096508.72032: entering _queue_task() for managed_node2/assert 27712 1727096508.73094: worker is 1 (out of 1 available) 27712 1727096508.73107: exiting _queue_task() for managed_node2/assert 27712 1727096508.73119: done queuing things up, now waiting for results queue to drain 27712 1727096508.73120: waiting for pending results... 
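[Editor's aside] The two tasks traced here follow the usual stat-then-assert pattern: an ansible.builtin.stat call against /sys/class/net/ethtest1 with get_attributes, get_checksum and get_mime disabled, followed by an assert on the recorded result. The YAML below is a hedged reconstruction from the logged module arguments and the evaluated conditional (not interface_stat.stat.exists); the real tasks/assert_device_absent.yml in the fedora.linux_system_roles tests may be organized differently (the trace attributes interface_stat to set_fact, so the actual file may set it as a fact rather than using a plain register).

# Sketch only - reconstructed from the logged module args and assert conditional.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # /sys/class/net/ethtest1 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                   # name taken from the variable resolution in the trace

- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists       # the conditional the log reports as evaluated True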
27712 1727096508.73954: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest1' 27712 1727096508.74356: in run() - task 0afff68d-5257-cbc7-8716-000000000817 27712 1727096508.74478: variable 'ansible_search_path' from source: unknown 27712 1727096508.74482: variable 'ansible_search_path' from source: unknown 27712 1727096508.74662: calling self._execute() 27712 1727096508.75078: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.75081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.75085: variable 'omit' from source: magic vars 27712 1727096508.76742: variable 'ansible_distribution_major_version' from source: facts 27712 1727096508.76747: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096508.76751: variable 'omit' from source: magic vars 27712 1727096508.76778: variable 'omit' from source: magic vars 27712 1727096508.77071: variable 'interface' from source: set_fact 27712 1727096508.77177: variable 'omit' from source: magic vars 27712 1727096508.77327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096508.77340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096508.77455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096508.77458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096508.77460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096508.77657: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096508.77661: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.77664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.77896: Set connection var ansible_connection to ssh 27712 1727096508.77997: Set connection var ansible_pipelining to False 27712 1727096508.78101: Set connection var ansible_timeout to 10 27712 1727096508.78105: Set connection var ansible_shell_type to sh 27712 1727096508.78108: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096508.78110: Set connection var ansible_shell_executable to /bin/sh 27712 1727096508.78112: variable 'ansible_shell_executable' from source: unknown 27712 1727096508.78118: variable 'ansible_connection' from source: unknown 27712 1727096508.78120: variable 'ansible_module_compression' from source: unknown 27712 1727096508.78122: variable 'ansible_shell_type' from source: unknown 27712 1727096508.78124: variable 'ansible_shell_executable' from source: unknown 27712 1727096508.78126: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.78128: variable 'ansible_pipelining' from source: unknown 27712 1727096508.78131: variable 'ansible_timeout' from source: unknown 27712 1727096508.78134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.78447: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096508.78665: variable 'omit' from source: magic vars 27712 1727096508.78670: starting attempt loop 27712 1727096508.78673: running the handler 27712 1727096508.78884: variable 'interface_stat' from source: set_fact 27712 1727096508.79052: Evaluated conditional (not interface_stat.stat.exists): True 27712 1727096508.79062: handler run complete 27712 1727096508.79086: attempt loop complete, returning result 27712 1727096508.79095: _execute() done 27712 1727096508.79110: dumping result to json 27712 1727096508.79119: done dumping result, returning 27712 1727096508.79130: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest1' [0afff68d-5257-cbc7-8716-000000000817] 27712 1727096508.79155: sending task result for task 0afff68d-5257-cbc7-8716-000000000817 27712 1727096508.79647: done sending task result for task 0afff68d-5257-cbc7-8716-000000000817 27712 1727096508.79651: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096508.79713: no more pending results, returning what we have 27712 1727096508.79718: results queue empty 27712 1727096508.79720: checking for any_errors_fatal 27712 1727096508.79729: done checking for any_errors_fatal 27712 1727096508.79730: checking for max_fail_percentage 27712 1727096508.79732: done checking for max_fail_percentage 27712 1727096508.79733: checking to see if all hosts have failed and the running result is not ok 27712 1727096508.79733: done checking to see if all hosts have failed 27712 1727096508.79734: getting the remaining hosts for this loop 27712 1727096508.79736: done getting the remaining hosts for this loop 27712 1727096508.79740: getting the next task for host managed_node2 27712 1727096508.79749: done getting next task for host managed_node2 27712 1727096508.79752: ^ task is: TASK: Set interface0 27712 1727096508.79756: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096508.79761: getting variables 27712 1727096508.79763: in VariableManager get_vars() 27712 1727096508.79811: Calling all_inventory to load vars for managed_node2 27712 1727096508.79814: Calling groups_inventory to load vars for managed_node2 27712 1727096508.79817: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096508.79828: Calling all_plugins_play to load vars for managed_node2 27712 1727096508.79832: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096508.79835: Calling groups_plugins_play to load vars for managed_node2 27712 1727096508.83752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096508.87087: done with get_vars() 27712 1727096508.87117: done getting variables 27712 1727096508.87344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface0] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:155 Monday 23 September 2024 09:01:48 -0400 (0:00:00.153) 0:00:34.567 ****** 27712 1727096508.87418: entering _queue_task() for managed_node2/set_fact 27712 1727096508.88127: worker is 1 (out of 1 available) 27712 1727096508.88140: exiting _queue_task() for managed_node2/set_fact 27712 1727096508.88151: done queuing things up, now waiting for results queue to drain 27712 1727096508.88152: waiting for pending results... 27712 1727096508.88958: running TaskExecutor() for managed_node2/TASK: Set interface0 27712 1727096508.89160: in run() - task 0afff68d-5257-cbc7-8716-0000000000b7 27712 1727096508.89486: variable 'ansible_search_path' from source: unknown 27712 1727096508.89521: calling self._execute() 27712 1727096508.89619: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.89625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.89635: variable 'omit' from source: magic vars 27712 1727096508.90809: variable 'ansible_distribution_major_version' from source: facts 27712 1727096508.90822: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096508.90830: variable 'omit' from source: magic vars 27712 1727096508.90869: variable 'omit' from source: magic vars 27712 1727096508.91301: variable 'interface0' from source: play vars 27712 1727096508.91381: variable 'interface0' from source: play vars 27712 1727096508.91400: variable 'omit' from source: magic vars 27712 1727096508.91440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096508.91880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096508.91902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096508.91920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096508.91931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096508.91959: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096508.91962: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.91965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.92066: Set connection var ansible_connection to ssh 27712 1727096508.92482: Set connection var ansible_pipelining to False 27712 1727096508.92487: Set connection var ansible_timeout to 10 27712 1727096508.92491: Set connection var ansible_shell_type to sh 27712 1727096508.92498: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096508.92504: Set connection var ansible_shell_executable to /bin/sh 27712 1727096508.92526: variable 'ansible_shell_executable' from source: unknown 27712 1727096508.92530: variable 'ansible_connection' from source: unknown 27712 1727096508.92533: variable 'ansible_module_compression' from source: unknown 27712 1727096508.92535: variable 'ansible_shell_type' from source: unknown 27712 1727096508.92538: variable 'ansible_shell_executable' from source: unknown 27712 1727096508.92540: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.92542: variable 'ansible_pipelining' from source: unknown 27712 1727096508.92544: variable 'ansible_timeout' from source: unknown 27712 1727096508.92549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.93098: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096508.93110: variable 'omit' from source: magic vars 27712 1727096508.93116: starting attempt loop 27712 1727096508.93119: running the handler 27712 1727096508.93130: handler run complete 27712 1727096508.93141: attempt loop complete, returning result 27712 1727096508.93144: _execute() done 27712 1727096508.93146: dumping result to json 27712 1727096508.93149: done dumping result, returning 27712 1727096508.93158: done running TaskExecutor() for managed_node2/TASK: Set interface0 [0afff68d-5257-cbc7-8716-0000000000b7] 27712 1727096508.93160: sending task result for task 0afff68d-5257-cbc7-8716-0000000000b7 27712 1727096508.93262: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000b7 27712 1727096508.93264: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest0" }, "changed": false } 27712 1727096508.93357: no more pending results, returning what we have 27712 1727096508.93361: results queue empty 27712 1727096508.93362: checking for any_errors_fatal 27712 1727096508.93574: done checking for any_errors_fatal 27712 1727096508.93576: checking for max_fail_percentage 27712 1727096508.93578: done checking for max_fail_percentage 27712 1727096508.93579: checking to see if all hosts have failed and the running result is not ok 27712 1727096508.93580: done checking to see if all hosts have failed 27712 1727096508.93581: getting the remaining hosts for this loop 27712 1727096508.93583: done getting the remaining hosts for this loop 27712 1727096508.93587: getting the next task for host managed_node2 27712 1727096508.93594: done getting next task for host managed_node2 27712 1727096508.93597: ^ task is: TASK: Delete interface0 27712 1727096508.93601: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, 
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096508.93606: getting variables 27712 1727096508.93608: in VariableManager get_vars() 27712 1727096508.93651: Calling all_inventory to load vars for managed_node2 27712 1727096508.93653: Calling groups_inventory to load vars for managed_node2 27712 1727096508.93656: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096508.93675: Calling all_plugins_play to load vars for managed_node2 27712 1727096508.93680: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096508.93683: Calling groups_plugins_play to load vars for managed_node2 27712 1727096508.95479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096508.98297: done with get_vars() 27712 1727096508.98325: done getting variables TASK [Delete interface0] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:158 Monday 23 September 2024 09:01:48 -0400 (0:00:00.110) 0:00:34.677 ****** 27712 1727096508.98437: entering _queue_task() for managed_node2/include_tasks 27712 1727096508.98909: worker is 1 (out of 1 available) 27712 1727096508.98921: exiting _queue_task() for managed_node2/include_tasks 27712 1727096508.98931: done queuing things up, now waiting for results queue to drain 27712 1727096508.98933: waiting for pending results... 
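[Editor's aside] The two play-level tasks dispatched here (tests_route_device.yml:155 and :158) are simple: a set_fact that points the shared interface variable at the play variable interface0 (which resolves to ethtest0 in this run), and an include_tasks that pulls in the shared delete_interface.yml task file. A hedged sketch, reconstructed from the task names, results and include path in the trace rather than from the playbook source:

# Sketch only - the real tests_route_device.yml may differ in detail.
- name: Set interface0
  ansible.builtin.set_fact:
    interface: "{{ interface0 }}"            # the logged result sets ansible_facts.interface to ethtest0

- name: Delete interface0
  ansible.builtin.include_tasks: tasks/delete_interface.yml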
27712 1727096508.99231: running TaskExecutor() for managed_node2/TASK: Delete interface0 27712 1727096508.99239: in run() - task 0afff68d-5257-cbc7-8716-0000000000b8 27712 1727096508.99264: variable 'ansible_search_path' from source: unknown 27712 1727096508.99308: calling self._execute() 27712 1727096508.99414: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096508.99473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096508.99482: variable 'omit' from source: magic vars 27712 1727096508.99846: variable 'ansible_distribution_major_version' from source: facts 27712 1727096508.99873: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096508.99884: _execute() done 27712 1727096508.99893: dumping result to json 27712 1727096508.99899: done dumping result, returning 27712 1727096508.99916: done running TaskExecutor() for managed_node2/TASK: Delete interface0 [0afff68d-5257-cbc7-8716-0000000000b8] 27712 1727096508.99977: sending task result for task 0afff68d-5257-cbc7-8716-0000000000b8 27712 1727096509.00191: no more pending results, returning what we have 27712 1727096509.00197: in VariableManager get_vars() 27712 1727096509.00250: Calling all_inventory to load vars for managed_node2 27712 1727096509.00253: Calling groups_inventory to load vars for managed_node2 27712 1727096509.00256: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096509.00271: Calling all_plugins_play to load vars for managed_node2 27712 1727096509.00275: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096509.00278: Calling groups_plugins_play to load vars for managed_node2 27712 1727096509.01074: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000b8 27712 1727096509.01078: WORKER PROCESS EXITING 27712 1727096509.03613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096509.05511: done with get_vars() 27712 1727096509.05529: variable 'ansible_search_path' from source: unknown 27712 1727096509.05607: we have included files to process 27712 1727096509.05608: generating all_blocks data 27712 1727096509.05610: done generating all_blocks data 27712 1727096509.05613: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27712 1727096509.05614: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27712 1727096509.05616: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 27712 1727096509.05845: done processing included file 27712 1727096509.05848: iterating over new_blocks loaded from include file 27712 1727096509.05849: in VariableManager get_vars() 27712 1727096509.06070: done with get_vars() 27712 1727096509.06073: filtering new block on tags 27712 1727096509.06108: done filtering new block on tags 27712 1727096509.06111: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 27712 1727096509.06117: extending task lists for all hosts with included blocks 27712 1727096509.09294: done extending task lists 27712 1727096509.09296: done processing included files 27712 1727096509.09296: results queue 
empty 27712 1727096509.09297: checking for any_errors_fatal 27712 1727096509.09300: done checking for any_errors_fatal 27712 1727096509.09301: checking for max_fail_percentage 27712 1727096509.09302: done checking for max_fail_percentage 27712 1727096509.09303: checking to see if all hosts have failed and the running result is not ok 27712 1727096509.09303: done checking to see if all hosts have failed 27712 1727096509.09304: getting the remaining hosts for this loop 27712 1727096509.09305: done getting the remaining hosts for this loop 27712 1727096509.09308: getting the next task for host managed_node2 27712 1727096509.09312: done getting next task for host managed_node2 27712 1727096509.09314: ^ task is: TASK: Remove test interface if necessary 27712 1727096509.09317: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096509.09319: getting variables 27712 1727096509.09320: in VariableManager get_vars() 27712 1727096509.09334: Calling all_inventory to load vars for managed_node2 27712 1727096509.09336: Calling groups_inventory to load vars for managed_node2 27712 1727096509.09338: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096509.09344: Calling all_plugins_play to load vars for managed_node2 27712 1727096509.09346: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096509.09348: Calling groups_plugins_play to load vars for managed_node2 27712 1727096509.10993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096509.13699: done with get_vars() 27712 1727096509.13718: done getting variables 27712 1727096509.13760: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Monday 23 September 2024 09:01:49 -0400 (0:00:00.153) 0:00:34.831 ****** 27712 1727096509.13797: entering _queue_task() for managed_node2/command 27712 1727096509.14147: worker is 1 (out of 1 available) 27712 1727096509.14159: exiting _queue_task() for managed_node2/command 27712 1727096509.14318: done queuing things up, now waiting for results queue to drain 27712 1727096509.14320: waiting for pending results... 
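[Editor's aside] The task queued here, Remove test interface if necessary (delete_interface.yml:3), runs below through the command action plugin; the logged module arguments show _raw_params "ip link del ethtest0" with _uses_shell false, i.e. the plain command module rather than shell. A minimal sketch of such a task, assuming the interface name is templated in (any error handling the real file adds, e.g. tolerating an already-absent device, is not visible in this trace):

# Sketch only - inferred from the module args logged below.
- name: Remove test interface if necessary
  ansible.builtin.command: ip link del {{ interface }}   # expands to: ip link del ethtest0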
27712 1727096509.14471: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 27712 1727096509.14598: in run() - task 0afff68d-5257-cbc7-8716-0000000008da 27712 1727096509.14616: variable 'ansible_search_path' from source: unknown 27712 1727096509.14623: variable 'ansible_search_path' from source: unknown 27712 1727096509.14673: calling self._execute() 27712 1727096509.14779: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096509.14791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096509.14805: variable 'omit' from source: magic vars 27712 1727096509.15196: variable 'ansible_distribution_major_version' from source: facts 27712 1727096509.15214: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096509.15225: variable 'omit' from source: magic vars 27712 1727096509.15276: variable 'omit' from source: magic vars 27712 1727096509.15378: variable 'interface' from source: set_fact 27712 1727096509.15408: variable 'omit' from source: magic vars 27712 1727096509.15454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096509.15520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096509.15526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096509.15547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096509.15562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096509.15627: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096509.15632: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096509.15637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096509.15725: Set connection var ansible_connection to ssh 27712 1727096509.15752: Set connection var ansible_pipelining to False 27712 1727096509.15845: Set connection var ansible_timeout to 10 27712 1727096509.15850: Set connection var ansible_shell_type to sh 27712 1727096509.15853: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096509.15855: Set connection var ansible_shell_executable to /bin/sh 27712 1727096509.15857: variable 'ansible_shell_executable' from source: unknown 27712 1727096509.15859: variable 'ansible_connection' from source: unknown 27712 1727096509.15861: variable 'ansible_module_compression' from source: unknown 27712 1727096509.15863: variable 'ansible_shell_type' from source: unknown 27712 1727096509.15865: variable 'ansible_shell_executable' from source: unknown 27712 1727096509.15880: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096509.15886: variable 'ansible_pipelining' from source: unknown 27712 1727096509.15888: variable 'ansible_timeout' from source: unknown 27712 1727096509.15890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096509.16345: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096509.16455: variable 'omit' from source: magic vars 27712 1727096509.16458: starting attempt loop 27712 1727096509.16461: running the handler 27712 1727096509.16548: _low_level_execute_command(): starting 27712 1727096509.16551: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096509.18125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096509.18448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096509.20157: stdout chunk (state=3): >>>/root <<< 27712 1727096509.20224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096509.20284: stderr chunk (state=3): >>><<< 27712 1727096509.20594: stdout chunk (state=3): >>><<< 27712 1727096509.20613: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096509.20625: _low_level_execute_command(): starting 27712 1727096509.20699: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073 `" && echo ansible-tmp-1727096509.206124-29324-155995800033073="` echo /root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073 `" ) && sleep 0' 27712 
1727096509.21689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096509.21801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096509.21812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096509.21826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096509.21985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096509.22003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096509.22132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096509.22142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096509.22343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096509.24166: stdout chunk (state=3): >>>ansible-tmp-1727096509.206124-29324-155995800033073=/root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073 <<< 27712 1727096509.24332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096509.24335: stdout chunk (state=3): >>><<< 27712 1727096509.24338: stderr chunk (state=3): >>><<< 27712 1727096509.24420: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096509.206124-29324-155995800033073=/root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096509.24425: variable 'ansible_module_compression' from source: unknown 27712 1727096509.24452: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096509.24534: variable 'ansible_facts' from source: unknown 27712 1727096509.24669: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/AnsiballZ_command.py 27712 1727096509.24848: Sending initial data 27712 1727096509.24854: Sent initial data (155 bytes) 27712 1727096509.26101: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096509.26207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096509.26243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096509.26302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096509.26409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096509.26575: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096509.26619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096509.26757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096509.26792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096509.28266: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 27712 1727096509.28299: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096509.28322: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096509.28365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/AnsiballZ_command.py" <<< 27712 1727096509.28370: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmplv45mygg /root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/AnsiballZ_command.py <<< 27712 1727096509.28373: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmplv45mygg" to remote "/root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/AnsiballZ_command.py" <<< 27712 1727096509.29160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096509.29164: stdout chunk (state=3): >>><<< 27712 1727096509.29166: stderr chunk (state=3): >>><<< 27712 1727096509.29171: done transferring module to remote 27712 1727096509.29173: _low_level_execute_command(): starting 27712 1727096509.29181: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/ /root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/AnsiballZ_command.py && sleep 0' 27712 1727096509.30490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096509.30580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096509.30774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096509.30805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096509.32642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096509.32649: stdout chunk (state=3): >>><<< 27712 1727096509.32652: stderr chunk (state=3): >>><<< 27712 1727096509.32680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096509.32683: _low_level_execute_command(): starting 27712 1727096509.32688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/AnsiballZ_command.py && sleep 0' 27712 1727096509.33486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096509.33490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096509.33492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096509.33494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096509.33496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096509.33498: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096509.33499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096509.33501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096509.33503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096509.33505: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096509.33506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096509.33508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096509.33510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096509.33512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096509.33514: stderr chunk (state=3): >>>debug2: match found <<< 27712 1727096509.33515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096509.33660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096509.33663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096509.33666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096509.33669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096509.49893: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-23 09:01:49.488374", "end": "2024-09-23 09:01:49.495380", "delta": "0:00:00.007006", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096509.51699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096509.51704: stdout chunk (state=3): >>><<< 27712 1727096509.51707: stderr chunk (state=3): >>><<< 27712 1727096509.51712: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-23 09:01:49.488374", "end": "2024-09-23 09:01:49.495380", "delta": "0:00:00.007006", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096509.51716: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096509.51722: _low_level_execute_command(): starting 27712 1727096509.51725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096509.206124-29324-155995800033073/ > /dev/null 2>&1 && sleep 0' 27712 1727096509.52397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096509.52404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096509.52415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096509.52491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096509.52497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096509.52505: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096509.52512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096509.52515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096509.52517: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096509.52519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096509.52609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096509.52612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096509.52614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096509.52617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096509.52619: stderr chunk (state=3): >>>debug2: match found <<< 27712 1727096509.52620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096509.52670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096509.52701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096509.52704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096509.52762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096509.54696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096509.54700: stdout chunk (state=3): >>><<< 27712 1727096509.54702: stderr chunk (state=3): >>><<< 27712 1727096509.54907: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096509.54911: handler run complete 27712 1727096509.54980: Evaluated conditional (False): False 27712 1727096509.54983: attempt loop complete, returning result 27712 1727096509.54985: _execute() done 27712 1727096509.54988: dumping result to json 27712 1727096509.54990: done dumping result, returning 27712 1727096509.54992: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0afff68d-5257-cbc7-8716-0000000008da] 27712 1727096509.54994: sending task result for task 0afff68d-5257-cbc7-8716-0000000008da 27712 1727096509.55063: done sending task result for task 0afff68d-5257-cbc7-8716-0000000008da 27712 1727096509.55066: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.007006", "end": "2024-09-23 09:01:49.495380", "rc": 0, "start": "2024-09-23 09:01:49.488374" } 27712 1727096509.55142: no more pending results, returning what we have 27712 1727096509.55147: results queue empty 27712 1727096509.55148: checking for any_errors_fatal 27712 1727096509.55149: done checking for any_errors_fatal 27712 1727096509.55150: checking for max_fail_percentage 27712 1727096509.55151: done checking for max_fail_percentage 27712 1727096509.55152: checking to see if all hosts have failed and the running result is not ok 27712 1727096509.55153: done checking to see if all hosts have failed 27712 1727096509.55154: getting the remaining hosts for this loop 27712 1727096509.55155: done getting the remaining hosts for this loop 27712 1727096509.55158: getting the next task for host managed_node2 27712 1727096509.55166: done getting next task for host managed_node2 27712 1727096509.55174: ^ task is: TASK: Assert interface0 is absent 27712 1727096509.55178: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096509.55313: getting variables 27712 1727096509.55315: in VariableManager get_vars() 27712 1727096509.55352: Calling all_inventory to load vars for managed_node2 27712 1727096509.55355: Calling groups_inventory to load vars for managed_node2 27712 1727096509.55357: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096509.55390: Calling all_plugins_play to load vars for managed_node2 27712 1727096509.55396: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096509.55403: Calling groups_plugins_play to load vars for managed_node2 27712 1727096509.56857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096509.60502: done with get_vars() 27712 1727096509.60537: done getting variables TASK [Assert interface0 is absent] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:160 Monday 23 September 2024 09:01:49 -0400 (0:00:00.468) 0:00:35.299 ****** 27712 1727096509.60644: entering _queue_task() for managed_node2/include_tasks 27712 1727096509.61020: worker is 1 (out of 1 available) 27712 1727096509.61034: exiting _queue_task() for managed_node2/include_tasks 27712 1727096509.61045: done queuing things up, now waiting for results queue to drain 27712 1727096509.61047: waiting for pending results... 27712 1727096509.61559: running TaskExecutor() for managed_node2/TASK: Assert interface0 is absent 27712 1727096509.61564: in run() - task 0afff68d-5257-cbc7-8716-0000000000b9 27712 1727096509.61751: variable 'ansible_search_path' from source: unknown 27712 1727096509.61822: calling self._execute() 27712 1727096509.61994: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096509.62022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096509.62076: variable 'omit' from source: magic vars 27712 1727096509.62864: variable 'ansible_distribution_major_version' from source: facts 27712 1727096509.62884: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096509.62919: _execute() done 27712 1727096509.62929: dumping result to json 27712 1727096509.63021: done dumping result, returning 27712 1727096509.63024: done running TaskExecutor() for managed_node2/TASK: Assert interface0 is absent [0afff68d-5257-cbc7-8716-0000000000b9] 27712 1727096509.63027: sending task result for task 0afff68d-5257-cbc7-8716-0000000000b9 27712 1727096509.63105: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000b9 27712 1727096509.63109: WORKER PROCESS EXITING 27712 1727096509.63148: no more pending results, returning what we have 27712 1727096509.63154: in VariableManager get_vars() 27712 1727096509.63213: Calling all_inventory to load vars for managed_node2 27712 1727096509.63216: Calling groups_inventory to load vars for managed_node2 27712 1727096509.63219: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096509.63232: Calling all_plugins_play to load vars for managed_node2 27712 1727096509.63235: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096509.63238: Calling groups_plugins_play to load vars for managed_node2 27712 1727096509.65881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096509.67465: done with get_vars() 27712 1727096509.67488: variable 
'ansible_search_path' from source: unknown 27712 1727096509.67502: we have included files to process 27712 1727096509.67503: generating all_blocks data 27712 1727096509.67505: done generating all_blocks data 27712 1727096509.67509: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27712 1727096509.67510: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27712 1727096509.67512: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 27712 1727096509.67621: in VariableManager get_vars() 27712 1727096509.67645: done with get_vars() 27712 1727096509.67754: done processing included file 27712 1727096509.67756: iterating over new_blocks loaded from include file 27712 1727096509.67757: in VariableManager get_vars() 27712 1727096509.67778: done with get_vars() 27712 1727096509.67780: filtering new block on tags 27712 1727096509.67813: done filtering new block on tags 27712 1727096509.67816: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 27712 1727096509.67823: extending task lists for all hosts with included blocks 27712 1727096509.70057: done extending task lists 27712 1727096509.70058: done processing included files 27712 1727096509.70059: results queue empty 27712 1727096509.70060: checking for any_errors_fatal 27712 1727096509.70064: done checking for any_errors_fatal 27712 1727096509.70065: checking for max_fail_percentage 27712 1727096509.70066: done checking for max_fail_percentage 27712 1727096509.70067: checking to see if all hosts have failed and the running result is not ok 27712 1727096509.70070: done checking to see if all hosts have failed 27712 1727096509.70073: getting the remaining hosts for this loop 27712 1727096509.70074: done getting the remaining hosts for this loop 27712 1727096509.70077: getting the next task for host managed_node2 27712 1727096509.70080: done getting next task for host managed_node2 27712 1727096509.70082: ^ task is: TASK: Include the task 'get_interface_stat.yml' 27712 1727096509.70086: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096509.70088: getting variables 27712 1727096509.70089: in VariableManager get_vars() 27712 1727096509.70102: Calling all_inventory to load vars for managed_node2 27712 1727096509.70104: Calling groups_inventory to load vars for managed_node2 27712 1727096509.70106: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096509.70111: Calling all_plugins_play to load vars for managed_node2 27712 1727096509.70113: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096509.70116: Calling groups_plugins_play to load vars for managed_node2 27712 1727096509.71295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096509.73302: done with get_vars() 27712 1727096509.73322: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Monday 23 September 2024 09:01:49 -0400 (0:00:00.129) 0:00:35.429 ****** 27712 1727096509.73601: entering _queue_task() for managed_node2/include_tasks 27712 1727096509.74157: worker is 1 (out of 1 available) 27712 1727096509.74577: exiting _queue_task() for managed_node2/include_tasks 27712 1727096509.74588: done queuing things up, now waiting for results queue to drain 27712 1727096509.74589: waiting for pending results... 27712 1727096509.75093: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 27712 1727096509.75098: in run() - task 0afff68d-5257-cbc7-8716-000000000990 27712 1727096509.75102: variable 'ansible_search_path' from source: unknown 27712 1727096509.75105: variable 'ansible_search_path' from source: unknown 27712 1727096509.75333: calling self._execute() 27712 1727096509.75338: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096509.75341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096509.75353: variable 'omit' from source: magic vars 27712 1727096509.76156: variable 'ansible_distribution_major_version' from source: facts 27712 1727096509.76170: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096509.76212: _execute() done 27712 1727096509.76216: dumping result to json 27712 1727096509.76218: done dumping result, returning 27712 1727096509.76227: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-cbc7-8716-000000000990] 27712 1727096509.76232: sending task result for task 0afff68d-5257-cbc7-8716-000000000990 27712 1727096509.76345: done sending task result for task 0afff68d-5257-cbc7-8716-000000000990 27712 1727096509.76349: WORKER PROCESS EXITING 27712 1727096509.76384: no more pending results, returning what we have 27712 1727096509.76389: in VariableManager get_vars() 27712 1727096509.76439: Calling all_inventory to load vars for managed_node2 27712 1727096509.76443: Calling groups_inventory to load vars for managed_node2 27712 1727096509.76445: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096509.76460: Calling all_plugins_play to load vars for managed_node2 27712 1727096509.76463: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096509.76466: Calling groups_plugins_play to load vars for managed_node2 27712 1727096509.79616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 27712 1727096509.82125: done with get_vars() 27712 1727096509.82148: variable 'ansible_search_path' from source: unknown 27712 1727096509.82150: variable 'ansible_search_path' from source: unknown 27712 1727096509.82274: we have included files to process 27712 1727096509.82275: generating all_blocks data 27712 1727096509.82277: done generating all_blocks data 27712 1727096509.82278: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096509.82279: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096509.82283: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 27712 1727096509.82466: done processing included file 27712 1727096509.83074: iterating over new_blocks loaded from include file 27712 1727096509.83077: in VariableManager get_vars() 27712 1727096509.83099: done with get_vars() 27712 1727096509.83102: filtering new block on tags 27712 1727096509.83128: done filtering new block on tags 27712 1727096509.83131: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 27712 1727096509.83136: extending task lists for all hosts with included blocks 27712 1727096509.83261: done extending task lists 27712 1727096509.83262: done processing included files 27712 1727096509.83263: results queue empty 27712 1727096509.83264: checking for any_errors_fatal 27712 1727096509.83267: done checking for any_errors_fatal 27712 1727096509.83269: checking for max_fail_percentage 27712 1727096509.83270: done checking for max_fail_percentage 27712 1727096509.83274: checking to see if all hosts have failed and the running result is not ok 27712 1727096509.83274: done checking to see if all hosts have failed 27712 1727096509.83275: getting the remaining hosts for this loop 27712 1727096509.83276: done getting the remaining hosts for this loop 27712 1727096509.83279: getting the next task for host managed_node2 27712 1727096509.83283: done getting next task for host managed_node2 27712 1727096509.83285: ^ task is: TASK: Get stat for interface {{ interface }} 27712 1727096509.83289: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 27712 1727096509.83291: getting variables 27712 1727096509.83292: in VariableManager get_vars() 27712 1727096509.83305: Calling all_inventory to load vars for managed_node2 27712 1727096509.83307: Calling groups_inventory to load vars for managed_node2 27712 1727096509.83309: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096509.83314: Calling all_plugins_play to load vars for managed_node2 27712 1727096509.83317: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096509.83319: Calling groups_plugins_play to load vars for managed_node2 27712 1727096509.85990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096509.89517: done with get_vars() 27712 1727096509.89541: done getting variables 27712 1727096509.89911: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 09:01:49 -0400 (0:00:00.163) 0:00:35.592 ****** 27712 1727096509.89944: entering _queue_task() for managed_node2/stat 27712 1727096509.90495: worker is 1 (out of 1 available) 27712 1727096509.90505: exiting _queue_task() for managed_node2/stat 27712 1727096509.90516: done queuing things up, now waiting for results queue to drain 27712 1727096509.90518: waiting for pending results... 27712 1727096509.91223: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 27712 1727096509.91327: in run() - task 0afff68d-5257-cbc7-8716-000000000a4d 27712 1727096509.91347: variable 'ansible_search_path' from source: unknown 27712 1727096509.91356: variable 'ansible_search_path' from source: unknown 27712 1727096509.91428: calling self._execute() 27712 1727096509.91521: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096509.91541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096509.91557: variable 'omit' from source: magic vars 27712 1727096509.91960: variable 'ansible_distribution_major_version' from source: facts 27712 1727096509.91987: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096509.92035: variable 'omit' from source: magic vars 27712 1727096509.92064: variable 'omit' from source: magic vars 27712 1727096509.92174: variable 'interface' from source: set_fact 27712 1727096509.92209: variable 'omit' from source: magic vars 27712 1727096509.92261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096509.92364: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096509.92368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096509.92371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096509.92373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096509.92415: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096509.92423: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096509.92432: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096509.92551: Set connection var ansible_connection to ssh 27712 1727096509.92564: Set connection var ansible_pipelining to False 27712 1727096509.92581: Set connection var ansible_timeout to 10 27712 1727096509.92590: Set connection var ansible_shell_type to sh 27712 1727096509.92622: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096509.92626: Set connection var ansible_shell_executable to /bin/sh 27712 1727096509.92672: variable 'ansible_shell_executable' from source: unknown 27712 1727096509.92675: variable 'ansible_connection' from source: unknown 27712 1727096509.92678: variable 'ansible_module_compression' from source: unknown 27712 1727096509.92681: variable 'ansible_shell_type' from source: unknown 27712 1727096509.92687: variable 'ansible_shell_executable' from source: unknown 27712 1727096509.92690: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096509.92693: variable 'ansible_pipelining' from source: unknown 27712 1727096509.92695: variable 'ansible_timeout' from source: unknown 27712 1727096509.92729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096509.92943: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096509.93018: variable 'omit' from source: magic vars 27712 1727096509.93021: starting attempt loop 27712 1727096509.93027: running the handler 27712 1727096509.93030: _low_level_execute_command(): starting 27712 1727096509.93032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096509.93796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096509.93828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096509.93903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096509.93925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096509.93957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096509.94114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096509.95733: stdout chunk (state=3): >>>/root <<< 27712 1727096509.95876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096509.95888: stdout chunk (state=3): >>><<< 27712 1727096509.95906: stderr chunk 
(state=3): >>><<< 27712 1727096509.95979: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096509.95982: _low_level_execute_command(): starting 27712 1727096509.95986: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846 `" && echo ansible-tmp-1727096509.9593403-29362-241984977536846="` echo /root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846 `" ) && sleep 0' 27712 1727096509.96602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096509.96703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096509.96747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096509.96764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096509.96789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096509.96862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096509.98838: stdout chunk (state=3): >>>ansible-tmp-1727096509.9593403-29362-241984977536846=/root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846 <<< 27712 1727096509.98974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096509.98978: stdout chunk (state=3): >>><<< 27712 1727096509.98982: stderr chunk 
(state=3): >>><<< 27712 1727096509.99001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096509.9593403-29362-241984977536846=/root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096509.99039: variable 'ansible_module_compression' from source: unknown 27712 1727096509.99090: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27712 1727096509.99123: variable 'ansible_facts' from source: unknown 27712 1727096509.99187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/AnsiballZ_stat.py 27712 1727096509.99293: Sending initial data 27712 1727096509.99298: Sent initial data (153 bytes) 27712 1727096509.99794: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096509.99810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096509.99896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096509.99938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096509.99957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096510.00091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096510.00623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096510.02237: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096510.02272: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096510.02302: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpgdyk24ii /root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/AnsiballZ_stat.py <<< 27712 1727096510.02306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/AnsiballZ_stat.py" <<< 27712 1727096510.02332: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpgdyk24ii" to remote "/root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/AnsiballZ_stat.py" <<< 27712 1727096510.02812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096510.02846: stderr chunk (state=3): >>><<< 27712 1727096510.02850: stdout chunk (state=3): >>><<< 27712 1727096510.02867: done transferring module to remote 27712 1727096510.02880: _low_level_execute_command(): starting 27712 1727096510.02885: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/ /root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/AnsiballZ_stat.py && sleep 0' 27712 1727096510.03336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096510.03343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096510.03410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096510.03414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096510.03417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096510.03441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting 
O_NONBLOCK <<< 27712 1727096510.03495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096510.03589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096510.05494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096510.05498: stdout chunk (state=3): >>><<< 27712 1727096510.05500: stderr chunk (state=3): >>><<< 27712 1727096510.05553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096510.05557: _low_level_execute_command(): starting 27712 1727096510.05560: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/AnsiballZ_stat.py && sleep 0' 27712 1727096510.06234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096510.06238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096510.06240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096510.06243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096510.06245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096510.06251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096510.06292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096510.06307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096510.06349: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 27712 1727096510.21914: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27712 1727096510.23357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096510.23361: stdout chunk (state=3): >>><<< 27712 1727096510.23363: stderr chunk (state=3): >>><<< 27712 1727096510.23481: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
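The module run above was shipped to the target as AnsiballZ_stat.py and executed with /usr/bin/python3.12; its module_args query /sys/class/net/ethtest0 with attribute, checksum and mime collection switched off. A minimal sketch of the get_interface_stat.yml task behind it, assuming the result is registered as interface_stat (the variable the subsequent assertion reads):

- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/ethtest0 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                   # assumption: name inferred from the later interface_stat lookup

Only the stat.exists flag is consumed afterwards, so checksum and mime detection stay disabled to keep the call cheap.
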
27712 1727096510.23485: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096510.23487: _low_level_execute_command(): starting 27712 1727096510.23489: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096509.9593403-29362-241984977536846/ > /dev/null 2>&1 && sleep 0' 27712 1727096510.24112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096510.24205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096510.24214: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096510.24237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096510.24331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096510.26206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096510.26210: stdout chunk (state=3): >>><<< 27712 1727096510.26214: stderr chunk (state=3): >>><<< 27712 1727096510.26233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096510.26253: handler run complete 27712 1727096510.26278: attempt loop complete, returning result 27712 1727096510.26281: _execute() done 27712 1727096510.26283: dumping result to json 27712 1727096510.26286: done dumping result, returning 27712 1727096510.26288: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [0afff68d-5257-cbc7-8716-000000000a4d] 27712 1727096510.26290: sending task result for task 0afff68d-5257-cbc7-8716-000000000a4d 27712 1727096510.26420: done sending task result for task 0afff68d-5257-cbc7-8716-000000000a4d 27712 1727096510.26422: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27712 1727096510.26488: no more pending results, returning what we have 27712 1727096510.26492: results queue empty 27712 1727096510.26493: checking for any_errors_fatal 27712 1727096510.26495: done checking for any_errors_fatal 27712 1727096510.26495: checking for max_fail_percentage 27712 1727096510.26497: done checking for max_fail_percentage 27712 1727096510.26498: checking to see if all hosts have failed and the running result is not ok 27712 1727096510.26498: done checking to see if all hosts have failed 27712 1727096510.26499: getting the remaining hosts for this loop 27712 1727096510.26500: done getting the remaining hosts for this loop 27712 1727096510.26504: getting the next task for host managed_node2 27712 1727096510.26511: done getting next task for host managed_node2 27712 1727096510.26514: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 27712 1727096510.26518: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096510.26523: getting variables 27712 1727096510.26524: in VariableManager get_vars() 27712 1727096510.26576: Calling all_inventory to load vars for managed_node2 27712 1727096510.26579: Calling groups_inventory to load vars for managed_node2 27712 1727096510.26582: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096510.26592: Calling all_plugins_play to load vars for managed_node2 27712 1727096510.26617: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096510.26622: Calling groups_plugins_play to load vars for managed_node2 27712 1727096510.27478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096510.29266: done with get_vars() 27712 1727096510.29295: done getting variables 27712 1727096510.29347: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096510.29439: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Monday 23 September 2024 09:01:50 -0400 (0:00:00.395) 0:00:35.987 ****** 27712 1727096510.29463: entering _queue_task() for managed_node2/assert 27712 1727096510.29723: worker is 1 (out of 1 available) 27712 1727096510.29737: exiting _queue_task() for managed_node2/assert 27712 1727096510.29748: done queuing things up, now waiting for results queue to drain 27712 1727096510.29749: waiting for pending results... 
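The banner above comes from assert_device_absent.yml:5. A minimal sketch of that assertion, based only on the conditional the executor evaluates below (not interface_stat.stat.exists); the real task may also carry an explicit failure message, which this log does not show:

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists
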
27712 1727096510.30285: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' 27712 1727096510.30333: in run() - task 0afff68d-5257-cbc7-8716-000000000991 27712 1727096510.30346: variable 'ansible_search_path' from source: unknown 27712 1727096510.30349: variable 'ansible_search_path' from source: unknown 27712 1727096510.30391: calling self._execute() 27712 1727096510.30624: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.30637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.30652: variable 'omit' from source: magic vars 27712 1727096510.31119: variable 'ansible_distribution_major_version' from source: facts 27712 1727096510.31140: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096510.31146: variable 'omit' from source: magic vars 27712 1727096510.31185: variable 'omit' from source: magic vars 27712 1727096510.31251: variable 'interface' from source: set_fact 27712 1727096510.31265: variable 'omit' from source: magic vars 27712 1727096510.31304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096510.31331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096510.31345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096510.31358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096510.31369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096510.31399: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096510.31403: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.31412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.31483: Set connection var ansible_connection to ssh 27712 1727096510.31493: Set connection var ansible_pipelining to False 27712 1727096510.31496: Set connection var ansible_timeout to 10 27712 1727096510.31499: Set connection var ansible_shell_type to sh 27712 1727096510.31503: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096510.31509: Set connection var ansible_shell_executable to /bin/sh 27712 1727096510.31526: variable 'ansible_shell_executable' from source: unknown 27712 1727096510.31529: variable 'ansible_connection' from source: unknown 27712 1727096510.31532: variable 'ansible_module_compression' from source: unknown 27712 1727096510.31534: variable 'ansible_shell_type' from source: unknown 27712 1727096510.31536: variable 'ansible_shell_executable' from source: unknown 27712 1727096510.31538: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.31541: variable 'ansible_pipelining' from source: unknown 27712 1727096510.31544: variable 'ansible_timeout' from source: unknown 27712 1727096510.31550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.31652: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 27712 1727096510.31661: variable 'omit' from source: magic vars 27712 1727096510.31664: starting attempt loop 27712 1727096510.31666: running the handler 27712 1727096510.31772: variable 'interface_stat' from source: set_fact 27712 1727096510.31784: Evaluated conditional (not interface_stat.stat.exists): True 27712 1727096510.31789: handler run complete 27712 1727096510.31800: attempt loop complete, returning result 27712 1727096510.31803: _execute() done 27712 1727096510.31806: dumping result to json 27712 1727096510.31808: done dumping result, returning 27712 1727096510.31820: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' [0afff68d-5257-cbc7-8716-000000000991] 27712 1727096510.31825: sending task result for task 0afff68d-5257-cbc7-8716-000000000991 27712 1727096510.31903: done sending task result for task 0afff68d-5257-cbc7-8716-000000000991 27712 1727096510.31905: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096510.31951: no more pending results, returning what we have 27712 1727096510.31955: results queue empty 27712 1727096510.31956: checking for any_errors_fatal 27712 1727096510.31971: done checking for any_errors_fatal 27712 1727096510.31972: checking for max_fail_percentage 27712 1727096510.31974: done checking for max_fail_percentage 27712 1727096510.31975: checking to see if all hosts have failed and the running result is not ok 27712 1727096510.31975: done checking to see if all hosts have failed 27712 1727096510.31976: getting the remaining hosts for this loop 27712 1727096510.31977: done getting the remaining hosts for this loop 27712 1727096510.31981: getting the next task for host managed_node2 27712 1727096510.31988: done getting next task for host managed_node2 27712 1727096510.31992: ^ task is: TASK: Assert interface0 profile and interface1 profile are absent 27712 1727096510.31995: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096510.32000: getting variables 27712 1727096510.32002: in VariableManager get_vars() 27712 1727096510.32041: Calling all_inventory to load vars for managed_node2 27712 1727096510.32044: Calling groups_inventory to load vars for managed_node2 27712 1727096510.32046: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096510.32055: Calling all_plugins_play to load vars for managed_node2 27712 1727096510.32058: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096510.32060: Calling groups_plugins_play to load vars for managed_node2 27712 1727096510.33011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096510.33965: done with get_vars() 27712 1727096510.33990: done getting variables TASK [Assert interface0 profile and interface1 profile are absent] ************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:162 Monday 23 September 2024 09:01:50 -0400 (0:00:00.046) 0:00:36.034 ****** 27712 1727096510.34070: entering _queue_task() for managed_node2/include_tasks 27712 1727096510.34400: worker is 1 (out of 1 available) 27712 1727096510.34411: exiting _queue_task() for managed_node2/include_tasks 27712 1727096510.34422: done queuing things up, now waiting for results queue to drain 27712 1727096510.34423: waiting for pending results... 27712 1727096510.34831: running TaskExecutor() for managed_node2/TASK: Assert interface0 profile and interface1 profile are absent 27712 1727096510.34837: in run() - task 0afff68d-5257-cbc7-8716-0000000000ba 27712 1727096510.34841: variable 'ansible_search_path' from source: unknown 27712 1727096510.34874: variable 'interface0' from source: play vars 27712 1727096510.35070: variable 'interface0' from source: play vars 27712 1727096510.35146: variable 'interface1' from source: play vars 27712 1727096510.35150: variable 'interface1' from source: play vars 27712 1727096510.35163: variable 'omit' from source: magic vars 27712 1727096510.35308: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.35317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.35334: variable 'omit' from source: magic vars 27712 1727096510.35561: variable 'ansible_distribution_major_version' from source: facts 27712 1727096510.35572: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096510.35608: variable 'item' from source: unknown 27712 1727096510.35692: variable 'item' from source: unknown 27712 1727096510.35895: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.35899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.36021: variable 'omit' from source: magic vars 27712 1727096510.36025: variable 'ansible_distribution_major_version' from source: facts 27712 1727096510.36028: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096510.36031: variable 'item' from source: unknown 27712 1727096510.36085: variable 'item' from source: unknown 27712 1727096510.36210: dumping result to json 27712 1727096510.36214: done dumping result, returning 27712 1727096510.36217: done running TaskExecutor() for managed_node2/TASK: Assert interface0 profile and interface1 profile are absent [0afff68d-5257-cbc7-8716-0000000000ba] 27712 1727096510.36219: sending task result for task 
0afff68d-5257-cbc7-8716-0000000000ba 27712 1727096510.36286: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000ba 27712 1727096510.36290: WORKER PROCESS EXITING 27712 1727096510.36318: no more pending results, returning what we have 27712 1727096510.36324: in VariableManager get_vars() 27712 1727096510.36376: Calling all_inventory to load vars for managed_node2 27712 1727096510.36380: Calling groups_inventory to load vars for managed_node2 27712 1727096510.36382: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096510.36396: Calling all_plugins_play to load vars for managed_node2 27712 1727096510.36400: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096510.36403: Calling groups_plugins_play to load vars for managed_node2 27712 1727096510.37837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096510.39341: done with get_vars() 27712 1727096510.39369: variable 'ansible_search_path' from source: unknown 27712 1727096510.39389: variable 'ansible_search_path' from source: unknown 27712 1727096510.39396: we have included files to process 27712 1727096510.39397: generating all_blocks data 27712 1727096510.39399: done generating all_blocks data 27712 1727096510.39402: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27712 1727096510.39403: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27712 1727096510.39406: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27712 1727096510.39563: in VariableManager get_vars() 27712 1727096510.39594: done with get_vars() 27712 1727096510.39704: done processing included file 27712 1727096510.39706: iterating over new_blocks loaded from include file 27712 1727096510.39708: in VariableManager get_vars() 27712 1727096510.39725: done with get_vars() 27712 1727096510.39727: filtering new block on tags 27712 1727096510.39758: done filtering new block on tags 27712 1727096510.39761: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=ethtest0) 27712 1727096510.39766: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27712 1727096510.39769: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27712 1727096510.39774: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 27712 1727096510.39846: in VariableManager get_vars() 27712 1727096510.39869: done with get_vars() 27712 1727096510.39950: done processing included file 27712 1727096510.39952: iterating over new_blocks loaded from include file 27712 1727096510.39954: in VariableManager get_vars() 27712 1727096510.39978: done with get_vars() 27712 1727096510.39980: filtering new block on tags 27712 1727096510.40013: done filtering new block on tags 27712 1727096510.40015: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=ethtest1) 27712 1727096510.40019: extending task lists for all hosts with included blocks 27712 1727096510.41706: done extending task lists 27712 1727096510.41708: done processing included files 27712 1727096510.41709: results queue empty 27712 1727096510.41709: checking for any_errors_fatal 27712 1727096510.41713: done checking for any_errors_fatal 27712 1727096510.41713: checking for max_fail_percentage 27712 1727096510.41714: done checking for max_fail_percentage 27712 1727096510.41715: checking to see if all hosts have failed and the running result is not ok 27712 1727096510.41715: done checking to see if all hosts have failed 27712 1727096510.41716: getting the remaining hosts for this loop 27712 1727096510.41717: done getting the remaining hosts for this loop 27712 1727096510.41719: getting the next task for host managed_node2 27712 1727096510.41723: done getting next task for host managed_node2 27712 1727096510.41725: ^ task is: TASK: Include the task 'get_profile_stat.yml' 27712 1727096510.41727: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096510.41730: getting variables 27712 1727096510.41731: in VariableManager get_vars() 27712 1727096510.41746: Calling all_inventory to load vars for managed_node2 27712 1727096510.41748: Calling groups_inventory to load vars for managed_node2 27712 1727096510.41755: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096510.41760: Calling all_plugins_play to load vars for managed_node2 27712 1727096510.41762: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096510.41764: Calling groups_plugins_play to load vars for managed_node2 27712 1727096510.47318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096510.48853: done with get_vars() 27712 1727096510.48891: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Monday 23 September 2024 09:01:50 -0400 (0:00:00.149) 0:00:36.183 ****** 27712 1727096510.48976: entering _queue_task() for managed_node2/include_tasks 27712 1727096510.49338: worker is 1 (out of 1 available) 27712 1727096510.49349: exiting _queue_task() for managed_node2/include_tasks 27712 1727096510.49359: done queuing things up, now waiting for results queue to drain 27712 1727096510.49361: waiting for pending results... 
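The include task above (tests_route_device.yml:162) fans assert_profile_absent.yml out once per interface. A rough sketch consistent with what the trace shows -- the two include items ethtest0 and ethtest1, the interface0/interface1 play vars, and the later "variable 'profile' from source: include params" entries -- follows; the exact vars mapping is an assumption, not the verbatim test playbook:

# Hypothetical reconstruction -- not the verbatim upstream test file.
- name: Assert interface0 profile and interface1 profile are absent
  ansible.builtin.include_tasks: tasks/assert_profile_absent.yml
  vars:
    profile: "{{ item }}"   # assumed; the trace only shows 'profile' arriving as an include param
  loop:
    - "{{ interface0 }}"
    - "{{ interface1 }}"
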
27712 1727096510.49979: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 27712 1727096510.50139: in run() - task 0afff68d-5257-cbc7-8716-000000000a6c 27712 1727096510.50203: variable 'ansible_search_path' from source: unknown 27712 1727096510.50305: variable 'ansible_search_path' from source: unknown 27712 1727096510.50311: calling self._execute() 27712 1727096510.50381: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.50395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.50417: variable 'omit' from source: magic vars 27712 1727096510.50814: variable 'ansible_distribution_major_version' from source: facts 27712 1727096510.50833: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096510.50854: _execute() done 27712 1727096510.50863: dumping result to json 27712 1727096510.50874: done dumping result, returning 27712 1727096510.50888: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-cbc7-8716-000000000a6c] 27712 1727096510.50899: sending task result for task 0afff68d-5257-cbc7-8716-000000000a6c 27712 1727096510.51034: done sending task result for task 0afff68d-5257-cbc7-8716-000000000a6c 27712 1727096510.51039: WORKER PROCESS EXITING 27712 1727096510.51076: no more pending results, returning what we have 27712 1727096510.51081: in VariableManager get_vars() 27712 1727096510.51131: Calling all_inventory to load vars for managed_node2 27712 1727096510.51134: Calling groups_inventory to load vars for managed_node2 27712 1727096510.51136: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096510.51149: Calling all_plugins_play to load vars for managed_node2 27712 1727096510.51151: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096510.51153: Calling groups_plugins_play to load vars for managed_node2 27712 1727096510.54010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096510.57563: done with get_vars() 27712 1727096510.57586: variable 'ansible_search_path' from source: unknown 27712 1727096510.57588: variable 'ansible_search_path' from source: unknown 27712 1727096510.57630: we have included files to process 27712 1727096510.57632: generating all_blocks data 27712 1727096510.57634: done generating all_blocks data 27712 1727096510.57635: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27712 1727096510.57636: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27712 1727096510.57638: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27712 1727096510.58893: done processing included file 27712 1727096510.58896: iterating over new_blocks loaded from include file 27712 1727096510.58897: in VariableManager get_vars() 27712 1727096510.58928: done with get_vars() 27712 1727096510.58930: filtering new block on tags 27712 1727096510.59091: done filtering new block on tags 27712 1727096510.59095: in VariableManager get_vars() 27712 1727096510.59115: done with get_vars() 27712 1727096510.59116: filtering new block on tags 27712 1727096510.59186: done filtering new block on tags 27712 1727096510.59189: done iterating over 
new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 27712 1727096510.59194: extending task lists for all hosts with included blocks 27712 1727096510.59340: done extending task lists 27712 1727096510.59342: done processing included files 27712 1727096510.59343: results queue empty 27712 1727096510.59344: checking for any_errors_fatal 27712 1727096510.59349: done checking for any_errors_fatal 27712 1727096510.59349: checking for max_fail_percentage 27712 1727096510.59350: done checking for max_fail_percentage 27712 1727096510.59351: checking to see if all hosts have failed and the running result is not ok 27712 1727096510.59352: done checking to see if all hosts have failed 27712 1727096510.59353: getting the remaining hosts for this loop 27712 1727096510.59354: done getting the remaining hosts for this loop 27712 1727096510.59356: getting the next task for host managed_node2 27712 1727096510.59361: done getting next task for host managed_node2 27712 1727096510.59363: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 27712 1727096510.59366: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096510.59370: getting variables 27712 1727096510.59371: in VariableManager get_vars() 27712 1727096510.59385: Calling all_inventory to load vars for managed_node2 27712 1727096510.59387: Calling groups_inventory to load vars for managed_node2 27712 1727096510.59390: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096510.59400: Calling all_plugins_play to load vars for managed_node2 27712 1727096510.59403: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096510.59406: Calling groups_plugins_play to load vars for managed_node2 27712 1727096510.60616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096510.62812: done with get_vars() 27712 1727096510.62834: done getting variables 27712 1727096510.62883: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 09:01:50 -0400 (0:00:00.139) 0:00:36.322 ****** 27712 1727096510.62921: entering _queue_task() for managed_node2/set_fact 27712 1727096510.63408: worker is 1 (out of 1 available) 27712 1727096510.63418: exiting _queue_task() for managed_node2/set_fact 27712 1727096510.63429: done queuing things up, now waiting for results queue to drain 27712 1727096510.63431: waiting for pending results... 
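The set_fact task queued here (get_profile_stat.yml:3) initializes three tracking flags; its body can be read directly off the ansible_facts reported in the result that follows, so this sketch should be close to the real task:

# Sketch matching the facts reported in the result below.
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
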
27712 1727096510.63671: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 27712 1727096510.63840: in run() - task 0afff68d-5257-cbc7-8716-000000000b3c 27712 1727096510.63844: variable 'ansible_search_path' from source: unknown 27712 1727096510.63846: variable 'ansible_search_path' from source: unknown 27712 1727096510.63850: calling self._execute() 27712 1727096510.63971: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.63991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.64007: variable 'omit' from source: magic vars 27712 1727096510.64463: variable 'ansible_distribution_major_version' from source: facts 27712 1727096510.64494: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096510.64528: variable 'omit' from source: magic vars 27712 1727096510.64573: variable 'omit' from source: magic vars 27712 1727096510.64621: variable 'omit' from source: magic vars 27712 1727096510.64671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096510.64745: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096510.64749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096510.64774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096510.64791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096510.64854: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096510.64858: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.64860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.64965: Set connection var ansible_connection to ssh 27712 1727096510.65040: Set connection var ansible_pipelining to False 27712 1727096510.65043: Set connection var ansible_timeout to 10 27712 1727096510.65046: Set connection var ansible_shell_type to sh 27712 1727096510.65049: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096510.65051: Set connection var ansible_shell_executable to /bin/sh 27712 1727096510.65054: variable 'ansible_shell_executable' from source: unknown 27712 1727096510.65056: variable 'ansible_connection' from source: unknown 27712 1727096510.65059: variable 'ansible_module_compression' from source: unknown 27712 1727096510.65072: variable 'ansible_shell_type' from source: unknown 27712 1727096510.65082: variable 'ansible_shell_executable' from source: unknown 27712 1727096510.65089: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.65098: variable 'ansible_pipelining' from source: unknown 27712 1727096510.65104: variable 'ansible_timeout' from source: unknown 27712 1727096510.65111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.65276: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096510.65365: variable 
'omit' from source: magic vars 27712 1727096510.65370: starting attempt loop 27712 1727096510.65372: running the handler 27712 1727096510.65374: handler run complete 27712 1727096510.65377: attempt loop complete, returning result 27712 1727096510.65379: _execute() done 27712 1727096510.65381: dumping result to json 27712 1727096510.65383: done dumping result, returning 27712 1727096510.65386: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-cbc7-8716-000000000b3c] 27712 1727096510.65388: sending task result for task 0afff68d-5257-cbc7-8716-000000000b3c ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 27712 1727096510.65615: no more pending results, returning what we have 27712 1727096510.65619: results queue empty 27712 1727096510.65620: checking for any_errors_fatal 27712 1727096510.65622: done checking for any_errors_fatal 27712 1727096510.65623: checking for max_fail_percentage 27712 1727096510.65624: done checking for max_fail_percentage 27712 1727096510.65625: checking to see if all hosts have failed and the running result is not ok 27712 1727096510.65626: done checking to see if all hosts have failed 27712 1727096510.65626: getting the remaining hosts for this loop 27712 1727096510.65628: done getting the remaining hosts for this loop 27712 1727096510.65632: getting the next task for host managed_node2 27712 1727096510.65641: done getting next task for host managed_node2 27712 1727096510.65643: ^ task is: TASK: Stat profile file 27712 1727096510.65649: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096510.65653: getting variables 27712 1727096510.65655: in VariableManager get_vars() 27712 1727096510.65699: Calling all_inventory to load vars for managed_node2 27712 1727096510.65702: Calling groups_inventory to load vars for managed_node2 27712 1727096510.65705: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096510.65717: Calling all_plugins_play to load vars for managed_node2 27712 1727096510.65720: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096510.65723: Calling groups_plugins_play to load vars for managed_node2 27712 1727096510.66396: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b3c 27712 1727096510.66399: WORKER PROCESS EXITING 27712 1727096510.67552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096510.70142: done with get_vars() 27712 1727096510.70575: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 09:01:50 -0400 (0:00:00.077) 0:00:36.400 ****** 27712 1727096510.70680: entering _queue_task() for managed_node2/stat 27712 1727096510.71438: worker is 1 (out of 1 available) 27712 1727096510.71453: exiting _queue_task() for managed_node2/stat 27712 1727096510.71469: done queuing things up, now waiting for results queue to drain 27712 1727096510.71471: waiting for pending results... 27712 1727096510.71964: running TaskExecutor() for managed_node2/TASK: Stat profile file 27712 1727096510.72257: in run() - task 0afff68d-5257-cbc7-8716-000000000b3d 27712 1727096510.72276: variable 'ansible_search_path' from source: unknown 27712 1727096510.72280: variable 'ansible_search_path' from source: unknown 27712 1727096510.72407: calling self._execute() 27712 1727096510.72656: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.72662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.72681: variable 'omit' from source: magic vars 27712 1727096510.73455: variable 'ansible_distribution_major_version' from source: facts 27712 1727096510.73466: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096510.73477: variable 'omit' from source: magic vars 27712 1727096510.73688: variable 'omit' from source: magic vars 27712 1727096510.73945: variable 'profile' from source: include params 27712 1727096510.73949: variable 'item' from source: include params 27712 1727096510.74030: variable 'item' from source: include params 27712 1727096510.74050: variable 'omit' from source: magic vars 27712 1727096510.74221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096510.74260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096510.74284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096510.74444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096510.74447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096510.74514: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 27712 1727096510.74519: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.74522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.74687: Set connection var ansible_connection to ssh 27712 1727096510.74695: Set connection var ansible_pipelining to False 27712 1727096510.74774: Set connection var ansible_timeout to 10 27712 1727096510.74781: Set connection var ansible_shell_type to sh 27712 1727096510.74790: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096510.74795: Set connection var ansible_shell_executable to /bin/sh 27712 1727096510.74891: variable 'ansible_shell_executable' from source: unknown 27712 1727096510.74894: variable 'ansible_connection' from source: unknown 27712 1727096510.74897: variable 'ansible_module_compression' from source: unknown 27712 1727096510.74900: variable 'ansible_shell_type' from source: unknown 27712 1727096510.74902: variable 'ansible_shell_executable' from source: unknown 27712 1727096510.74904: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096510.74907: variable 'ansible_pipelining' from source: unknown 27712 1727096510.74910: variable 'ansible_timeout' from source: unknown 27712 1727096510.74913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096510.75380: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096510.75414: variable 'omit' from source: magic vars 27712 1727096510.75417: starting attempt loop 27712 1727096510.75424: running the handler 27712 1727096510.75427: _low_level_execute_command(): starting 27712 1727096510.75429: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096510.76182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096510.76200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096510.76291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096510.76305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096510.76320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096510.76344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096510.76530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096510.78225: stdout chunk (state=3): >>>/root <<< 
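The "Stat profile file" task being executed here (get_profile_stat.yml:9) invokes the stat module with the arguments visible in the module result further down. A sketch consistent with those arguments; the templated path and the registered name profile_stat are inferences (from profile=ethtest0 and the later profile_stat.stat.exists conditional), not a verbatim copy of the upstream file:

# Hypothetical reconstruction from the logged module_args.
- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
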
27712 1727096510.78374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096510.78377: stdout chunk (state=3): >>><<< 27712 1727096510.78380: stderr chunk (state=3): >>><<< 27712 1727096510.78500: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096510.78503: _low_level_execute_command(): starting 27712 1727096510.78506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004 `" && echo ansible-tmp-1727096510.784052-29408-133291501335004="` echo /root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004 `" ) && sleep 0' 27712 1727096510.79031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096510.79045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096510.79087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096510.79114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096510.79200: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096510.79246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096510.79286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096510.81373: stdout chunk (state=3): 
>>>ansible-tmp-1727096510.784052-29408-133291501335004=/root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004 <<< 27712 1727096510.81453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096510.81464: stdout chunk (state=3): >>><<< 27712 1727096510.81479: stderr chunk (state=3): >>><<< 27712 1727096510.81500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096510.784052-29408-133291501335004=/root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096510.81562: variable 'ansible_module_compression' from source: unknown 27712 1727096510.81879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27712 1727096510.81882: variable 'ansible_facts' from source: unknown 27712 1727096510.81979: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/AnsiballZ_stat.py 27712 1727096510.82521: Sending initial data 27712 1727096510.82540: Sent initial data (152 bytes) 27712 1727096510.83750: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096510.83888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096510.83905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096510.83926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
27712 1727096510.83993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096510.85634: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096510.85692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096510.85853: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp3wy4jqgy /root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/AnsiballZ_stat.py <<< 27712 1727096510.85863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/AnsiballZ_stat.py" <<< 27712 1727096510.85866: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp3wy4jqgy" to remote "/root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/AnsiballZ_stat.py" <<< 27712 1727096510.87283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096510.87287: stdout chunk (state=3): >>><<< 27712 1727096510.87289: stderr chunk (state=3): >>><<< 27712 1727096510.87291: done transferring module to remote 27712 1727096510.87293: _low_level_execute_command(): starting 27712 1727096510.87295: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/ /root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/AnsiballZ_stat.py && sleep 0' 27712 1727096510.88940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096510.88944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096510.88946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096510.89290: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096510.89589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096510.91381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096510.91385: stdout chunk (state=3): >>><<< 27712 1727096510.91388: stderr chunk (state=3): >>><<< 27712 1727096510.91390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096510.91393: _low_level_execute_command(): starting 27712 1727096510.91396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/AnsiballZ_stat.py && sleep 0' 27712 1727096510.92774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096510.92779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096510.92781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096510.92783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096510.92832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096511.08375: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, 
"get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27712 1727096511.09793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096511.09797: stdout chunk (state=3): >>><<< 27712 1727096511.09803: stderr chunk (state=3): >>><<< 27712 1727096511.09825: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
27712 1727096511.09854: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096511.09863: _low_level_execute_command(): starting 27712 1727096511.09870: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096510.784052-29408-133291501335004/ > /dev/null 2>&1 && sleep 0' 27712 1727096511.10657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096511.10676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096511.10696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096511.10748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096511.12594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096511.12651: stderr chunk (state=3): >>><<< 27712 1727096511.12679: stdout chunk (state=3): >>><<< 27712 1727096511.12878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096511.12885: handler run complete 27712 1727096511.12887: attempt loop complete, returning result 27712 1727096511.12889: _execute() done 27712 1727096511.12890: dumping result to json 27712 1727096511.12892: done dumping result, returning 27712 1727096511.12893: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0afff68d-5257-cbc7-8716-000000000b3d] 27712 1727096511.12895: sending task result for task 0afff68d-5257-cbc7-8716-000000000b3d 27712 1727096511.12962: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b3d 27712 1727096511.12965: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27712 1727096511.13038: no more pending results, returning what we have 27712 1727096511.13042: results queue empty 27712 1727096511.13044: checking for any_errors_fatal 27712 1727096511.13052: done checking for any_errors_fatal 27712 1727096511.13053: checking for max_fail_percentage 27712 1727096511.13055: done checking for max_fail_percentage 27712 1727096511.13056: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.13057: done checking to see if all hosts have failed 27712 1727096511.13058: getting the remaining hosts for this loop 27712 1727096511.13060: done getting the remaining hosts for this loop 27712 1727096511.13064: getting the next task for host managed_node2 27712 1727096511.13112: done getting next task for host managed_node2 27712 1727096511.13116: ^ task is: TASK: Set NM profile exist flag based on the profile files 27712 1727096511.13123: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096511.13128: getting variables 27712 1727096511.13129: in VariableManager get_vars() 27712 1727096511.13397: Calling all_inventory to load vars for managed_node2 27712 1727096511.13400: Calling groups_inventory to load vars for managed_node2 27712 1727096511.13403: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.13415: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.13418: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.13421: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.16163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.17950: done with get_vars() 27712 1727096511.17982: done getting variables 27712 1727096511.18041: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 09:01:51 -0400 (0:00:00.473) 0:00:36.874 ****** 27712 1727096511.18084: entering _queue_task() for managed_node2/set_fact 27712 1727096511.18493: worker is 1 (out of 1 available) 27712 1727096511.18505: exiting _queue_task() for managed_node2/set_fact 27712 1727096511.18629: done queuing things up, now waiting for results queue to drain 27712 1727096511.18631: waiting for pending results... 
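The "exists": false result above is what drives the skip that follows: the stat of /etc/sysconfig/network-scripts/ifcfg-ethtest0 was registered as profile_stat, and the task queued here is guarded by profile_stat.stat.exists. A minimal sketch of that stat-and-flag pair, reconstructed only from the module arguments and conditionals visible in this log; the exact contents of get_profile_stat.yml are not reproduced here, and the fact name profile_found is a placeholder.

# Stat the initscripts profile file for the interface under test; the path and
# options mirror the stat module arguments logged for "Stat profile file" above.
- name: Stat profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-ethtest0
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat

# Only mark the profile as present when the ifcfg file exists; with
# "exists": false the condition evaluates to False and the task is skipped.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    profile_found: true    # placeholder fact name, not taken from the log
  when: profile_stat.stat.exists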
27712 1727096511.18982: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 27712 1727096511.18986: in run() - task 0afff68d-5257-cbc7-8716-000000000b3e 27712 1727096511.18990: variable 'ansible_search_path' from source: unknown 27712 1727096511.18993: variable 'ansible_search_path' from source: unknown 27712 1727096511.18995: calling self._execute() 27712 1727096511.19080: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.19090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.19101: variable 'omit' from source: magic vars 27712 1727096511.19518: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.19533: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.19656: variable 'profile_stat' from source: set_fact 27712 1727096511.19675: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096511.19730: when evaluation is False, skipping this task 27712 1727096511.19733: _execute() done 27712 1727096511.19735: dumping result to json 27712 1727096511.19738: done dumping result, returning 27712 1727096511.19740: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-cbc7-8716-000000000b3e] 27712 1727096511.19742: sending task result for task 0afff68d-5257-cbc7-8716-000000000b3e skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096511.19979: no more pending results, returning what we have 27712 1727096511.19984: results queue empty 27712 1727096511.19985: checking for any_errors_fatal 27712 1727096511.19996: done checking for any_errors_fatal 27712 1727096511.19997: checking for max_fail_percentage 27712 1727096511.19998: done checking for max_fail_percentage 27712 1727096511.19999: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.20000: done checking to see if all hosts have failed 27712 1727096511.20001: getting the remaining hosts for this loop 27712 1727096511.20002: done getting the remaining hosts for this loop 27712 1727096511.20006: getting the next task for host managed_node2 27712 1727096511.20012: done getting next task for host managed_node2 27712 1727096511.20015: ^ task is: TASK: Get NM profile info 27712 1727096511.20021: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 27712 1727096511.20025: getting variables 27712 1727096511.20027: in VariableManager get_vars() 27712 1727096511.20176: Calling all_inventory to load vars for managed_node2 27712 1727096511.20185: Calling groups_inventory to load vars for managed_node2 27712 1727096511.20188: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.20206: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b3e 27712 1727096511.20209: WORKER PROCESS EXITING 27712 1727096511.20220: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.20224: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.20227: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.21963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.23651: done with get_vars() 27712 1727096511.23679: done getting variables 27712 1727096511.23782: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 09:01:51 -0400 (0:00:00.057) 0:00:36.931 ****** 27712 1727096511.23813: entering _queue_task() for managed_node2/shell 27712 1727096511.23815: Creating lock for shell 27712 1727096511.24299: worker is 1 (out of 1 available) 27712 1727096511.24310: exiting _queue_task() for managed_node2/shell 27712 1727096511.24322: done queuing things up, now waiting for results queue to drain 27712 1727096511.24324: waiting for pending results... 
27712 1727096511.24491: running TaskExecutor() for managed_node2/TASK: Get NM profile info 27712 1727096511.24602: in run() - task 0afff68d-5257-cbc7-8716-000000000b3f 27712 1727096511.24616: variable 'ansible_search_path' from source: unknown 27712 1727096511.24619: variable 'ansible_search_path' from source: unknown 27712 1727096511.24662: calling self._execute() 27712 1727096511.24778: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.24824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.24829: variable 'omit' from source: magic vars 27712 1727096511.25248: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.25263: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.25296: variable 'omit' from source: magic vars 27712 1727096511.25374: variable 'omit' from source: magic vars 27712 1727096511.25437: variable 'profile' from source: include params 27712 1727096511.25441: variable 'item' from source: include params 27712 1727096511.25533: variable 'item' from source: include params 27712 1727096511.25569: variable 'omit' from source: magic vars 27712 1727096511.25682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096511.25687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096511.25696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096511.25700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096511.25702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096511.25754: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096511.25757: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.25760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.25874: Set connection var ansible_connection to ssh 27712 1727096511.25877: Set connection var ansible_pipelining to False 27712 1727096511.25885: Set connection var ansible_timeout to 10 27712 1727096511.25955: Set connection var ansible_shell_type to sh 27712 1727096511.25958: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096511.25961: Set connection var ansible_shell_executable to /bin/sh 27712 1727096511.25964: variable 'ansible_shell_executable' from source: unknown 27712 1727096511.26032: variable 'ansible_connection' from source: unknown 27712 1727096511.26035: variable 'ansible_module_compression' from source: unknown 27712 1727096511.26038: variable 'ansible_shell_type' from source: unknown 27712 1727096511.26040: variable 'ansible_shell_executable' from source: unknown 27712 1727096511.26042: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.26044: variable 'ansible_pipelining' from source: unknown 27712 1727096511.26046: variable 'ansible_timeout' from source: unknown 27712 1727096511.26050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.26201: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096511.26205: variable 'omit' from source: magic vars 27712 1727096511.26207: starting attempt loop 27712 1727096511.26209: running the handler 27712 1727096511.26212: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096511.26214: _low_level_execute_command(): starting 27712 1727096511.26216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096511.27055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096511.27096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096511.27129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096511.27136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096511.27195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096511.28882: stdout chunk (state=3): >>>/root <<< 27712 1727096511.29173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096511.29177: stdout chunk (state=3): >>><<< 27712 1727096511.29179: stderr chunk (state=3): >>><<< 27712 1727096511.29183: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096511.29185: _low_level_execute_command(): starting 27712 1727096511.29188: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386 `" && echo ansible-tmp-1727096511.291333-29438-202235528709386="` echo /root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386 `" ) && sleep 0' 27712 1727096511.30157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096511.30171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096511.30231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096511.30239: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096511.30343: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096511.30357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096511.30410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096511.30512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096511.30545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096511.32680: stdout chunk (state=3): >>>ansible-tmp-1727096511.291333-29438-202235528709386=/root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386 <<< 27712 1727096511.32874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096511.32879: stdout chunk (state=3): >>><<< 27712 1727096511.32881: stderr chunk (state=3): >>><<< 27712 1727096511.32884: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096511.291333-29438-202235528709386=/root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096511.32886: variable 'ansible_module_compression' from source: unknown 27712 1727096511.32888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096511.32913: variable 'ansible_facts' from source: unknown 27712 1727096511.32986: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/AnsiballZ_command.py 27712 1727096511.33218: Sending initial data 27712 1727096511.33222: Sent initial data (155 bytes) 27712 1727096511.33712: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096511.33722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096511.33732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096511.33745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096511.33896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096511.33904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096511.33940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096511.35586: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096511.35629: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096511.35681: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp5jnysjx9 /root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/AnsiballZ_command.py <<< 27712 1727096511.35690: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/AnsiballZ_command.py" <<< 27712 1727096511.35718: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp5jnysjx9" to remote "/root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/AnsiballZ_command.py" <<< 27712 1727096511.36483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096511.36524: stderr chunk (state=3): >>><<< 27712 1727096511.36528: stdout chunk (state=3): >>><<< 27712 1727096511.36543: done transferring module to remote 27712 1727096511.36559: _low_level_execute_command(): starting 27712 1727096511.36634: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/ /root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/AnsiballZ_command.py && sleep 0' 27712 1727096511.37719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096511.37877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096511.38021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096511.38137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096511.38143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096511.39962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096511.39980: stderr chunk (state=3): >>><<< 27712 1727096511.39984: stdout chunk (state=3): >>><<< 27712 1727096511.40021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096511.40025: _low_level_execute_command(): starting 27712 1727096511.40030: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/AnsiballZ_command.py && sleep 0' 27712 1727096511.40562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096511.40566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096511.40571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096511.40573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096511.40627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096511.40633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096511.40638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096511.40708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096511.58166: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-23 09:01:51.564528", "end": "2024-09-23 09:01:51.580620", "delta": "0:00:00.016092", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096511.59783: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096511.59827: stderr chunk (state=3): >>><<< 27712 1727096511.59830: stdout chunk (state=3): >>><<< 27712 1727096511.59851: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-23 09:01:51.564528", "end": "2024-09-23 09:01:51.580620", "delta": "0:00:00.016092", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.126 closed. 
27712 1727096511.59884: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096511.59891: _low_level_execute_command(): starting 27712 1727096511.59896: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096511.291333-29438-202235528709386/ > /dev/null 2>&1 && sleep 0' 27712 1727096511.60514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096511.60517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096511.60527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096511.60537: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096511.60551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096511.60564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096511.60601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096511.62454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096511.62485: stderr chunk (state=3): >>><<< 27712 1727096511.62489: stdout chunk (state=3): >>><<< 27712 1727096511.62505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096511.62510: handler run complete 27712 1727096511.62528: Evaluated conditional (False): False 27712 1727096511.62551: attempt loop complete, returning result 27712 1727096511.62555: _execute() done 27712 1727096511.62557: dumping result to json 27712 1727096511.62560: done dumping result, returning 27712 1727096511.62565: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0afff68d-5257-cbc7-8716-000000000b3f] 27712 1727096511.62576: sending task result for task 0afff68d-5257-cbc7-8716-000000000b3f 27712 1727096511.62689: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b3f 27712 1727096511.62692: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.016092", "end": "2024-09-23 09:01:51.580620", "rc": 1, "start": "2024-09-23 09:01:51.564528" } MSG: non-zero return code ...ignoring 27712 1727096511.62798: no more pending results, returning what we have 27712 1727096511.62830: results queue empty 27712 1727096511.62832: checking for any_errors_fatal 27712 1727096511.62838: done checking for any_errors_fatal 27712 1727096511.62839: checking for max_fail_percentage 27712 1727096511.62841: done checking for max_fail_percentage 27712 1727096511.62842: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.62843: done checking to see if all hosts have failed 27712 1727096511.62843: getting the remaining hosts for this loop 27712 1727096511.62845: done getting the remaining hosts for this loop 27712 1727096511.62848: getting the next task for host managed_node2 27712 1727096511.62854: done getting next task for host managed_node2 27712 1727096511.62857: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27712 1727096511.62862: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 27712 1727096511.62866: getting variables 27712 1727096511.62869: in VariableManager get_vars() 27712 1727096511.62906: Calling all_inventory to load vars for managed_node2 27712 1727096511.62909: Calling groups_inventory to load vars for managed_node2 27712 1727096511.62910: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.62921: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.62924: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.62926: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.64016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.65149: done with get_vars() 27712 1727096511.65177: done getting variables 27712 1727096511.65245: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 09:01:51 -0400 (0:00:00.414) 0:00:37.346 ****** 27712 1727096511.65275: entering _queue_task() for managed_node2/set_fact 27712 1727096511.65600: worker is 1 (out of 1 available) 27712 1727096511.65614: exiting _queue_task() for managed_node2/set_fact 27712 1727096511.65630: done queuing things up, now waiting for results queue to drain 27712 1727096511.65632: waiting for pending results... 
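With rc=1 from the pipeline above, the nm_profile_exists.rc == 0 guard fails and this set_fact is skipped. A sketch under the assumption that the task simply sets two boolean facts; only the task name, the set_fact module, and the when condition come from the log, while the fact names below are placeholders.

# Runs only when nmcli reported a profile file under /etc (rc == 0);
# skipped in this run because the grep pipeline matched nothing.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    profile_exists_flag: true             # placeholder names, not shown in the log
    profile_ansible_managed_flag: true
  when: nm_profile_exists.rc == 0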
27712 1727096511.65926: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27712 1727096511.66052: in run() - task 0afff68d-5257-cbc7-8716-000000000b40 27712 1727096511.66056: variable 'ansible_search_path' from source: unknown 27712 1727096511.66063: variable 'ansible_search_path' from source: unknown 27712 1727096511.66149: calling self._execute() 27712 1727096511.66214: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.66219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.66228: variable 'omit' from source: magic vars 27712 1727096511.66639: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.66659: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.66794: variable 'nm_profile_exists' from source: set_fact 27712 1727096511.66831: Evaluated conditional (nm_profile_exists.rc == 0): False 27712 1727096511.66834: when evaluation is False, skipping this task 27712 1727096511.66837: _execute() done 27712 1727096511.66842: dumping result to json 27712 1727096511.66845: done dumping result, returning 27712 1727096511.66847: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-cbc7-8716-000000000b40] 27712 1727096511.66849: sending task result for task 0afff68d-5257-cbc7-8716-000000000b40 27712 1727096511.66972: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b40 27712 1727096511.66975: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 27712 1727096511.67026: no more pending results, returning what we have 27712 1727096511.67030: results queue empty 27712 1727096511.67031: checking for any_errors_fatal 27712 1727096511.67038: done checking for any_errors_fatal 27712 1727096511.67038: checking for max_fail_percentage 27712 1727096511.67040: done checking for max_fail_percentage 27712 1727096511.67040: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.67041: done checking to see if all hosts have failed 27712 1727096511.67042: getting the remaining hosts for this loop 27712 1727096511.67043: done getting the remaining hosts for this loop 27712 1727096511.67048: getting the next task for host managed_node2 27712 1727096511.67058: done getting next task for host managed_node2 27712 1727096511.67060: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 27712 1727096511.67065: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096511.67070: getting variables 27712 1727096511.67072: in VariableManager get_vars() 27712 1727096511.67115: Calling all_inventory to load vars for managed_node2 27712 1727096511.67119: Calling groups_inventory to load vars for managed_node2 27712 1727096511.67121: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.67132: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.67134: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.67137: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.68198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.69161: done with get_vars() 27712 1727096511.69185: done getting variables 27712 1727096511.69251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096511.69379: variable 'profile' from source: include params 27712 1727096511.69383: variable 'item' from source: include params 27712 1727096511.69458: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 09:01:51 -0400 (0:00:00.042) 0:00:37.388 ****** 27712 1727096511.69486: entering _queue_task() for managed_node2/command 27712 1727096511.69761: worker is 1 (out of 1 available) 27712 1727096511.69776: exiting _queue_task() for managed_node2/command 27712 1727096511.69787: done queuing things up, now waiting for results queue to drain 27712 1727096511.69789: waiting for pending results... 
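From here on, every remaining check in get_profile_stat.yml is guarded by profile_stat.stat.exists, so with the ifcfg file absent they are skipped one after another. A sketch of the kind of command task this entry corresponds to; because the task is skipped, its arguments are never rendered in the log, so the grep pattern, path, and register name below are assumptions.

# Look for the ansible_managed header inside the rendered ifcfg file; only
# meaningful when the file exists, hence the same stat-based guard.
- name: Get the ansible_managed comment in ifcfg-ethtest0
  command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-ethtest0   # assumed pattern and path
  register: ansible_managed_grep    # placeholder register name
  when: profile_stat.stat.exists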
27712 1727096511.70059: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 27712 1727096511.70169: in run() - task 0afff68d-5257-cbc7-8716-000000000b42 27712 1727096511.70184: variable 'ansible_search_path' from source: unknown 27712 1727096511.70192: variable 'ansible_search_path' from source: unknown 27712 1727096511.70238: calling self._execute() 27712 1727096511.70361: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.70386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.70390: variable 'omit' from source: magic vars 27712 1727096511.70822: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.70835: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.70996: variable 'profile_stat' from source: set_fact 27712 1727096511.70999: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096511.71005: when evaluation is False, skipping this task 27712 1727096511.71008: _execute() done 27712 1727096511.71010: dumping result to json 27712 1727096511.71013: done dumping result, returning 27712 1727096511.71016: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [0afff68d-5257-cbc7-8716-000000000b42] 27712 1727096511.71018: sending task result for task 0afff68d-5257-cbc7-8716-000000000b42 27712 1727096511.71174: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b42 27712 1727096511.71177: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096511.71240: no more pending results, returning what we have 27712 1727096511.71244: results queue empty 27712 1727096511.71245: checking for any_errors_fatal 27712 1727096511.71252: done checking for any_errors_fatal 27712 1727096511.71253: checking for max_fail_percentage 27712 1727096511.71254: done checking for max_fail_percentage 27712 1727096511.71255: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.71256: done checking to see if all hosts have failed 27712 1727096511.71256: getting the remaining hosts for this loop 27712 1727096511.71258: done getting the remaining hosts for this loop 27712 1727096511.71262: getting the next task for host managed_node2 27712 1727096511.71274: done getting next task for host managed_node2 27712 1727096511.71276: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 27712 1727096511.71281: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096511.71285: getting variables 27712 1727096511.71286: in VariableManager get_vars() 27712 1727096511.71323: Calling all_inventory to load vars for managed_node2 27712 1727096511.71327: Calling groups_inventory to load vars for managed_node2 27712 1727096511.71329: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.71339: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.71341: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.71344: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.72746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.73904: done with get_vars() 27712 1727096511.73939: done getting variables 27712 1727096511.74021: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096511.74155: variable 'profile' from source: include params 27712 1727096511.74160: variable 'item' from source: include params 27712 1727096511.74219: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 09:01:51 -0400 (0:00:00.047) 0:00:37.435 ****** 27712 1727096511.74246: entering _queue_task() for managed_node2/set_fact 27712 1727096511.74533: worker is 1 (out of 1 available) 27712 1727096511.74547: exiting _queue_task() for managed_node2/set_fact 27712 1727096511.74560: done queuing things up, now waiting for results queue to drain 27712 1727096511.74561: waiting for pending results... 
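The verify step would normally turn that grep result into a fact; like its neighbours it is skipped here, and the fingerprint pair at get_profile_stat.yml:62 and :69 follows the same shape. A sketch with placeholder names; only the task name, the set_fact module, and the profile_stat.stat.exists guard are confirmed by the log.

# Record whether the ansible_managed header was found in the ifcfg file;
# skipped in this run because the file does not exist.
- name: Verify the ansible_managed comment in ifcfg-ethtest0
  set_fact:
    ansible_managed_comment_ok: "{{ ansible_managed_grep.rc == 0 }}"   # placeholder names
  when: profile_stat.stat.exists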
27712 1727096511.74745: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 27712 1727096511.74829: in run() - task 0afff68d-5257-cbc7-8716-000000000b43 27712 1727096511.74842: variable 'ansible_search_path' from source: unknown 27712 1727096511.74845: variable 'ansible_search_path' from source: unknown 27712 1727096511.74878: calling self._execute() 27712 1727096511.74957: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.74961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.74970: variable 'omit' from source: magic vars 27712 1727096511.75254: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.75263: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.75349: variable 'profile_stat' from source: set_fact 27712 1727096511.75358: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096511.75361: when evaluation is False, skipping this task 27712 1727096511.75364: _execute() done 27712 1727096511.75366: dumping result to json 27712 1727096511.75374: done dumping result, returning 27712 1727096511.75378: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [0afff68d-5257-cbc7-8716-000000000b43] 27712 1727096511.75383: sending task result for task 0afff68d-5257-cbc7-8716-000000000b43 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096511.75523: no more pending results, returning what we have 27712 1727096511.75527: results queue empty 27712 1727096511.75528: checking for any_errors_fatal 27712 1727096511.75537: done checking for any_errors_fatal 27712 1727096511.75537: checking for max_fail_percentage 27712 1727096511.75539: done checking for max_fail_percentage 27712 1727096511.75540: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.75541: done checking to see if all hosts have failed 27712 1727096511.75541: getting the remaining hosts for this loop 27712 1727096511.75543: done getting the remaining hosts for this loop 27712 1727096511.75547: getting the next task for host managed_node2 27712 1727096511.75554: done getting next task for host managed_node2 27712 1727096511.75557: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 27712 1727096511.75562: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096511.75566: getting variables 27712 1727096511.75569: in VariableManager get_vars() 27712 1727096511.75615: Calling all_inventory to load vars for managed_node2 27712 1727096511.75618: Calling groups_inventory to load vars for managed_node2 27712 1727096511.75620: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.75631: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.75633: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.75635: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.76181: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b43 27712 1727096511.76184: WORKER PROCESS EXITING 27712 1727096511.76483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.77360: done with get_vars() 27712 1727096511.77382: done getting variables 27712 1727096511.77428: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096511.77513: variable 'profile' from source: include params 27712 1727096511.77516: variable 'item' from source: include params 27712 1727096511.77558: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 09:01:51 -0400 (0:00:00.033) 0:00:37.469 ****** 27712 1727096511.77584: entering _queue_task() for managed_node2/command 27712 1727096511.77851: worker is 1 (out of 1 available) 27712 1727096511.77864: exiting _queue_task() for managed_node2/command 27712 1727096511.77876: done queuing things up, now waiting for results queue to drain 27712 1727096511.77878: waiting for pending results... 
27712 1727096511.78070: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 27712 1727096511.78160: in run() - task 0afff68d-5257-cbc7-8716-000000000b44 27712 1727096511.78177: variable 'ansible_search_path' from source: unknown 27712 1727096511.78181: variable 'ansible_search_path' from source: unknown 27712 1727096511.78213: calling self._execute() 27712 1727096511.78289: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.78293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.78302: variable 'omit' from source: magic vars 27712 1727096511.78576: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.78584: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.78673: variable 'profile_stat' from source: set_fact 27712 1727096511.78681: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096511.78683: when evaluation is False, skipping this task 27712 1727096511.78687: _execute() done 27712 1727096511.78689: dumping result to json 27712 1727096511.78694: done dumping result, returning 27712 1727096511.78699: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 [0afff68d-5257-cbc7-8716-000000000b44] 27712 1727096511.78705: sending task result for task 0afff68d-5257-cbc7-8716-000000000b44 27712 1727096511.78793: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b44 27712 1727096511.78795: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096511.78849: no more pending results, returning what we have 27712 1727096511.78852: results queue empty 27712 1727096511.78854: checking for any_errors_fatal 27712 1727096511.78860: done checking for any_errors_fatal 27712 1727096511.78861: checking for max_fail_percentage 27712 1727096511.78863: done checking for max_fail_percentage 27712 1727096511.78864: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.78865: done checking to see if all hosts have failed 27712 1727096511.78865: getting the remaining hosts for this loop 27712 1727096511.78869: done getting the remaining hosts for this loop 27712 1727096511.78875: getting the next task for host managed_node2 27712 1727096511.78882: done getting next task for host managed_node2 27712 1727096511.78884: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 27712 1727096511.78889: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096511.78893: getting variables 27712 1727096511.78894: in VariableManager get_vars() 27712 1727096511.78935: Calling all_inventory to load vars for managed_node2 27712 1727096511.78937: Calling groups_inventory to load vars for managed_node2 27712 1727096511.78939: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.78949: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.78951: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.78953: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.79895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.80763: done with get_vars() 27712 1727096511.80786: done getting variables 27712 1727096511.80832: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096511.80914: variable 'profile' from source: include params 27712 1727096511.80917: variable 'item' from source: include params 27712 1727096511.80961: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 09:01:51 -0400 (0:00:00.034) 0:00:37.503 ****** 27712 1727096511.80989: entering _queue_task() for managed_node2/set_fact 27712 1727096511.81253: worker is 1 (out of 1 available) 27712 1727096511.81269: exiting _queue_task() for managed_node2/set_fact 27712 1727096511.81284: done queuing things up, now waiting for results queue to drain 27712 1727096511.81285: waiting for pending results... 
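The "Verify the fingerprint comment in ifcfg-ethtest0" task queued above (get_profile_stat.yml:69) resolves to set_fact and sits behind the same profile_stat.stat.exists guard, so it is skipped the same way when the ifcfg file does not exist. A hedged sketch, assuming it flips the lsr_net_profile_fingerprint flag that the "Initialize NM profile exist and ansible_managed comment flag" task in the same file sets to false; only the fact name, the action plugin and the guard appear in this log, and the exact expression is an assumption:

    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        # Assumption: in the real playbook this is presumably derived from the preceding command task's output.
        lsr_net_profile_fingerprint: true
      when: profile_stat.stat.exists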
27712 1727096511.81460: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 27712 1727096511.81551: in run() - task 0afff68d-5257-cbc7-8716-000000000b45 27712 1727096511.81564: variable 'ansible_search_path' from source: unknown 27712 1727096511.81569: variable 'ansible_search_path' from source: unknown 27712 1727096511.81598: calling self._execute() 27712 1727096511.81678: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.81682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.81690: variable 'omit' from source: magic vars 27712 1727096511.81955: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.81965: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.82047: variable 'profile_stat' from source: set_fact 27712 1727096511.82060: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096511.82065: when evaluation is False, skipping this task 27712 1727096511.82070: _execute() done 27712 1727096511.82075: dumping result to json 27712 1727096511.82077: done dumping result, returning 27712 1727096511.82080: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [0afff68d-5257-cbc7-8716-000000000b45] 27712 1727096511.82082: sending task result for task 0afff68d-5257-cbc7-8716-000000000b45 27712 1727096511.82169: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b45 27712 1727096511.82175: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096511.82238: no more pending results, returning what we have 27712 1727096511.82241: results queue empty 27712 1727096511.82242: checking for any_errors_fatal 27712 1727096511.82251: done checking for any_errors_fatal 27712 1727096511.82252: checking for max_fail_percentage 27712 1727096511.82253: done checking for max_fail_percentage 27712 1727096511.82254: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.82255: done checking to see if all hosts have failed 27712 1727096511.82255: getting the remaining hosts for this loop 27712 1727096511.82257: done getting the remaining hosts for this loop 27712 1727096511.82260: getting the next task for host managed_node2 27712 1727096511.82270: done getting next task for host managed_node2 27712 1727096511.82275: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 27712 1727096511.82279: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096511.82284: getting variables 27712 1727096511.82286: in VariableManager get_vars() 27712 1727096511.82323: Calling all_inventory to load vars for managed_node2 27712 1727096511.82326: Calling groups_inventory to load vars for managed_node2 27712 1727096511.82328: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.82337: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.82339: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.82341: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.83136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.83994: done with get_vars() 27712 1727096511.84013: done getting variables 27712 1727096511.84053: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096511.84134: variable 'profile' from source: include params 27712 1727096511.84137: variable 'item' from source: include params 27712 1727096511.84177: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Monday 23 September 2024 09:01:51 -0400 (0:00:00.032) 0:00:37.535 ****** 27712 1727096511.84199: entering _queue_task() for managed_node2/assert 27712 1727096511.84427: worker is 1 (out of 1 available) 27712 1727096511.84440: exiting _queue_task() for managed_node2/assert 27712 1727096511.84452: done queuing things up, now waiting for results queue to drain 27712 1727096511.84454: waiting for pending results... 
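The assert just queued comes from assert_profile_absent.yml:5, and the include that follows further down (assert_profile_absent.yml:3, for the next item) shows the shape of that file: include get_profile_stat.yml, then assert on the flag it leaves behind. A minimal sketch consistent with this log; the assertion expression (not lsr_net_profile_exists) and the included file name are taken from the output below, while the msg text is an assumption:

    - name: Include the task 'get_profile_stat.yml'
      include_tasks: get_profile_stat.yml

    - name: Assert that the profile is absent - '{{ profile }}'
      assert:
        that:
          - not lsr_net_profile_exists
        msg: Profile {{ profile }} should not exist   # assumption: the real message is not shown in this log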
27712 1727096511.84628: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' 27712 1727096511.84708: in run() - task 0afff68d-5257-cbc7-8716-000000000a6d 27712 1727096511.84720: variable 'ansible_search_path' from source: unknown 27712 1727096511.84724: variable 'ansible_search_path' from source: unknown 27712 1727096511.84752: calling self._execute() 27712 1727096511.84833: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.84837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.84845: variable 'omit' from source: magic vars 27712 1727096511.85115: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.85130: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.85133: variable 'omit' from source: magic vars 27712 1727096511.85164: variable 'omit' from source: magic vars 27712 1727096511.85236: variable 'profile' from source: include params 27712 1727096511.85239: variable 'item' from source: include params 27712 1727096511.85287: variable 'item' from source: include params 27712 1727096511.85302: variable 'omit' from source: magic vars 27712 1727096511.85335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096511.85364: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096511.85383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096511.85396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096511.85406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096511.85428: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096511.85431: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.85435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.85508: Set connection var ansible_connection to ssh 27712 1727096511.85515: Set connection var ansible_pipelining to False 27712 1727096511.85520: Set connection var ansible_timeout to 10 27712 1727096511.85523: Set connection var ansible_shell_type to sh 27712 1727096511.85529: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096511.85534: Set connection var ansible_shell_executable to /bin/sh 27712 1727096511.85550: variable 'ansible_shell_executable' from source: unknown 27712 1727096511.85553: variable 'ansible_connection' from source: unknown 27712 1727096511.85555: variable 'ansible_module_compression' from source: unknown 27712 1727096511.85559: variable 'ansible_shell_type' from source: unknown 27712 1727096511.85561: variable 'ansible_shell_executable' from source: unknown 27712 1727096511.85565: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.85568: variable 'ansible_pipelining' from source: unknown 27712 1727096511.85571: variable 'ansible_timeout' from source: unknown 27712 1727096511.85573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.85672: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096511.85685: variable 'omit' from source: magic vars 27712 1727096511.85689: starting attempt loop 27712 1727096511.85692: running the handler 27712 1727096511.85771: variable 'lsr_net_profile_exists' from source: set_fact 27712 1727096511.85777: Evaluated conditional (not lsr_net_profile_exists): True 27712 1727096511.85783: handler run complete 27712 1727096511.85794: attempt loop complete, returning result 27712 1727096511.85796: _execute() done 27712 1727096511.85799: dumping result to json 27712 1727096511.85803: done dumping result, returning 27712 1727096511.85815: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' [0afff68d-5257-cbc7-8716-000000000a6d] 27712 1727096511.85817: sending task result for task 0afff68d-5257-cbc7-8716-000000000a6d 27712 1727096511.85897: done sending task result for task 0afff68d-5257-cbc7-8716-000000000a6d 27712 1727096511.85900: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096511.85971: no more pending results, returning what we have 27712 1727096511.85974: results queue empty 27712 1727096511.85975: checking for any_errors_fatal 27712 1727096511.85981: done checking for any_errors_fatal 27712 1727096511.85982: checking for max_fail_percentage 27712 1727096511.85983: done checking for max_fail_percentage 27712 1727096511.85984: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.85985: done checking to see if all hosts have failed 27712 1727096511.85986: getting the remaining hosts for this loop 27712 1727096511.85987: done getting the remaining hosts for this loop 27712 1727096511.85990: getting the next task for host managed_node2 27712 1727096511.85998: done getting next task for host managed_node2 27712 1727096511.86001: ^ task is: TASK: Include the task 'get_profile_stat.yml' 27712 1727096511.86005: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096511.86008: getting variables 27712 1727096511.86009: in VariableManager get_vars() 27712 1727096511.86048: Calling all_inventory to load vars for managed_node2 27712 1727096511.86050: Calling groups_inventory to load vars for managed_node2 27712 1727096511.86053: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.86061: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.86063: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.86066: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.86932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.87795: done with get_vars() 27712 1727096511.87811: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Monday 23 September 2024 09:01:51 -0400 (0:00:00.036) 0:00:37.572 ****** 27712 1727096511.87880: entering _queue_task() for managed_node2/include_tasks 27712 1727096511.88112: worker is 1 (out of 1 available) 27712 1727096511.88125: exiting _queue_task() for managed_node2/include_tasks 27712 1727096511.88138: done queuing things up, now waiting for results queue to drain 27712 1727096511.88139: waiting for pending results... 27712 1727096511.88312: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 27712 1727096511.88397: in run() - task 0afff68d-5257-cbc7-8716-000000000a71 27712 1727096511.88410: variable 'ansible_search_path' from source: unknown 27712 1727096511.88413: variable 'ansible_search_path' from source: unknown 27712 1727096511.88439: calling self._execute() 27712 1727096511.88518: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.88522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.88530: variable 'omit' from source: magic vars 27712 1727096511.88799: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.88814: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.88817: _execute() done 27712 1727096511.88820: dumping result to json 27712 1727096511.88823: done dumping result, returning 27712 1727096511.88828: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-cbc7-8716-000000000a71] 27712 1727096511.88834: sending task result for task 0afff68d-5257-cbc7-8716-000000000a71 27712 1727096511.88927: done sending task result for task 0afff68d-5257-cbc7-8716-000000000a71 27712 1727096511.88930: WORKER PROCESS EXITING 27712 1727096511.88955: no more pending results, returning what we have 27712 1727096511.88959: in VariableManager get_vars() 27712 1727096511.89008: Calling all_inventory to load vars for managed_node2 27712 1727096511.89011: Calling groups_inventory to load vars for managed_node2 27712 1727096511.89013: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.89027: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.89030: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.89032: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.89807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 27712 1727096511.90770: done with get_vars() 27712 1727096511.90785: variable 'ansible_search_path' from source: unknown 27712 1727096511.90786: variable 'ansible_search_path' from source: unknown 27712 1727096511.90811: we have included files to process 27712 1727096511.90811: generating all_blocks data 27712 1727096511.90813: done generating all_blocks data 27712 1727096511.90817: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27712 1727096511.90817: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27712 1727096511.90819: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 27712 1727096511.91423: done processing included file 27712 1727096511.91425: iterating over new_blocks loaded from include file 27712 1727096511.91426: in VariableManager get_vars() 27712 1727096511.91440: done with get_vars() 27712 1727096511.91441: filtering new block on tags 27712 1727096511.91485: done filtering new block on tags 27712 1727096511.91487: in VariableManager get_vars() 27712 1727096511.91498: done with get_vars() 27712 1727096511.91499: filtering new block on tags 27712 1727096511.91533: done filtering new block on tags 27712 1727096511.91535: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 27712 1727096511.91539: extending task lists for all hosts with included blocks 27712 1727096511.91609: done extending task lists 27712 1727096511.91610: done processing included files 27712 1727096511.91611: results queue empty 27712 1727096511.91611: checking for any_errors_fatal 27712 1727096511.91614: done checking for any_errors_fatal 27712 1727096511.91614: checking for max_fail_percentage 27712 1727096511.91615: done checking for max_fail_percentage 27712 1727096511.91615: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.91616: done checking to see if all hosts have failed 27712 1727096511.91616: getting the remaining hosts for this loop 27712 1727096511.91617: done getting the remaining hosts for this loop 27712 1727096511.91618: getting the next task for host managed_node2 27712 1727096511.91622: done getting next task for host managed_node2 27712 1727096511.91624: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 27712 1727096511.91627: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096511.91629: getting variables 27712 1727096511.91630: in VariableManager get_vars() 27712 1727096511.91639: Calling all_inventory to load vars for managed_node2 27712 1727096511.91640: Calling groups_inventory to load vars for managed_node2 27712 1727096511.91641: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.91645: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.91646: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.91648: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.92315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.93166: done with get_vars() 27712 1727096511.93184: done getting variables 27712 1727096511.93209: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 09:01:51 -0400 (0:00:00.053) 0:00:37.625 ****** 27712 1727096511.93230: entering _queue_task() for managed_node2/set_fact 27712 1727096511.93473: worker is 1 (out of 1 available) 27712 1727096511.93485: exiting _queue_task() for managed_node2/set_fact 27712 1727096511.93497: done queuing things up, now waiting for results queue to drain 27712 1727096511.93498: waiting for pending results... 
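The "Initialize NM profile exist and ansible_managed comment flag" task queued here (get_profile_stat.yml:3) is a set_fact action, and the result it returns further down lists the facts it sets, so its body can be reconstructed almost verbatim from the ansible_facts in that result:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false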
27712 1727096511.93670: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 27712 1727096511.93745: in run() - task 0afff68d-5257-cbc7-8716-000000000b79 27712 1727096511.93756: variable 'ansible_search_path' from source: unknown 27712 1727096511.93759: variable 'ansible_search_path' from source: unknown 27712 1727096511.93789: calling self._execute() 27712 1727096511.93865: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.93870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.93880: variable 'omit' from source: magic vars 27712 1727096511.94141: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.94151: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.94161: variable 'omit' from source: magic vars 27712 1727096511.94201: variable 'omit' from source: magic vars 27712 1727096511.94225: variable 'omit' from source: magic vars 27712 1727096511.94257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096511.94291: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096511.94307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096511.94320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096511.94329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096511.94351: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096511.94354: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.94356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.94431: Set connection var ansible_connection to ssh 27712 1727096511.94437: Set connection var ansible_pipelining to False 27712 1727096511.94443: Set connection var ansible_timeout to 10 27712 1727096511.94446: Set connection var ansible_shell_type to sh 27712 1727096511.94452: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096511.94457: Set connection var ansible_shell_executable to /bin/sh 27712 1727096511.94476: variable 'ansible_shell_executable' from source: unknown 27712 1727096511.94481: variable 'ansible_connection' from source: unknown 27712 1727096511.94484: variable 'ansible_module_compression' from source: unknown 27712 1727096511.94487: variable 'ansible_shell_type' from source: unknown 27712 1727096511.94490: variable 'ansible_shell_executable' from source: unknown 27712 1727096511.94493: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.94496: variable 'ansible_pipelining' from source: unknown 27712 1727096511.94498: variable 'ansible_timeout' from source: unknown 27712 1727096511.94500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.94594: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096511.94602: variable 
'omit' from source: magic vars 27712 1727096511.94617: starting attempt loop 27712 1727096511.94626: running the handler 27712 1727096511.94629: handler run complete 27712 1727096511.94631: attempt loop complete, returning result 27712 1727096511.94634: _execute() done 27712 1727096511.94639: dumping result to json 27712 1727096511.94641: done dumping result, returning 27712 1727096511.94648: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-cbc7-8716-000000000b79] 27712 1727096511.94653: sending task result for task 0afff68d-5257-cbc7-8716-000000000b79 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 27712 1727096511.94792: no more pending results, returning what we have 27712 1727096511.94795: results queue empty 27712 1727096511.94796: checking for any_errors_fatal 27712 1727096511.94797: done checking for any_errors_fatal 27712 1727096511.94798: checking for max_fail_percentage 27712 1727096511.94800: done checking for max_fail_percentage 27712 1727096511.94801: checking to see if all hosts have failed and the running result is not ok 27712 1727096511.94802: done checking to see if all hosts have failed 27712 1727096511.94802: getting the remaining hosts for this loop 27712 1727096511.94804: done getting the remaining hosts for this loop 27712 1727096511.94807: getting the next task for host managed_node2 27712 1727096511.94814: done getting next task for host managed_node2 27712 1727096511.94816: ^ task is: TASK: Stat profile file 27712 1727096511.94821: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096511.94825: getting variables 27712 1727096511.94826: in VariableManager get_vars() 27712 1727096511.94875: Calling all_inventory to load vars for managed_node2 27712 1727096511.94878: Calling groups_inventory to load vars for managed_node2 27712 1727096511.94880: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096511.94885: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b79 27712 1727096511.94887: WORKER PROCESS EXITING 27712 1727096511.94895: Calling all_plugins_play to load vars for managed_node2 27712 1727096511.94898: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096511.94900: Calling groups_plugins_play to load vars for managed_node2 27712 1727096511.95745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096511.96605: done with get_vars() 27712 1727096511.96620: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 09:01:51 -0400 (0:00:00.034) 0:00:37.660 ****** 27712 1727096511.96683: entering _queue_task() for managed_node2/stat 27712 1727096511.96895: worker is 1 (out of 1 available) 27712 1727096511.96906: exiting _queue_task() for managed_node2/stat 27712 1727096511.96919: done queuing things up, now waiting for results queue to drain 27712 1727096511.96920: waiting for pending results... 27712 1727096511.97098: running TaskExecutor() for managed_node2/TASK: Stat profile file 27712 1727096511.97170: in run() - task 0afff68d-5257-cbc7-8716-000000000b7a 27712 1727096511.97182: variable 'ansible_search_path' from source: unknown 27712 1727096511.97186: variable 'ansible_search_path' from source: unknown 27712 1727096511.97214: calling self._execute() 27712 1727096511.97296: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.97300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.97309: variable 'omit' from source: magic vars 27712 1727096511.97577: variable 'ansible_distribution_major_version' from source: facts 27712 1727096511.97587: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096511.97590: variable 'omit' from source: magic vars 27712 1727096511.97626: variable 'omit' from source: magic vars 27712 1727096511.97697: variable 'profile' from source: include params 27712 1727096511.97702: variable 'item' from source: include params 27712 1727096511.97744: variable 'item' from source: include params 27712 1727096511.97759: variable 'omit' from source: magic vars 27712 1727096511.97791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096511.97821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096511.97835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096511.97849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096511.97858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096511.97882: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 27712 1727096511.97885: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.97888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.97957: Set connection var ansible_connection to ssh 27712 1727096511.97964: Set connection var ansible_pipelining to False 27712 1727096511.97973: Set connection var ansible_timeout to 10 27712 1727096511.97976: Set connection var ansible_shell_type to sh 27712 1727096511.97981: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096511.97985: Set connection var ansible_shell_executable to /bin/sh 27712 1727096511.98001: variable 'ansible_shell_executable' from source: unknown 27712 1727096511.98004: variable 'ansible_connection' from source: unknown 27712 1727096511.98006: variable 'ansible_module_compression' from source: unknown 27712 1727096511.98009: variable 'ansible_shell_type' from source: unknown 27712 1727096511.98011: variable 'ansible_shell_executable' from source: unknown 27712 1727096511.98013: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096511.98020: variable 'ansible_pipelining' from source: unknown 27712 1727096511.98023: variable 'ansible_timeout' from source: unknown 27712 1727096511.98025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096511.98266: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 27712 1727096511.98299: variable 'omit' from source: magic vars 27712 1727096511.98301: starting attempt loop 27712 1727096511.98330: running the handler 27712 1727096511.98369: _low_level_execute_command(): starting 27712 1727096511.98372: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096511.99194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096511.99198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096511.99201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096511.99249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096511.99262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096511.99316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.01081: stdout chunk (state=3): >>>/root <<< 27712 1727096512.01203: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 27712 1727096512.01210: stdout chunk (state=3): >>><<< 27712 1727096512.01273: stderr chunk (state=3): >>><<< 27712 1727096512.01283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096512.01303: _low_level_execute_command(): starting 27712 1727096512.01313: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342 `" && echo ansible-tmp-1727096512.0128446-29462-226875536789342="` echo /root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342 `" ) && sleep 0' 27712 1727096512.02583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096512.02660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.02664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.02669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096512.02684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096512.02687: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096512.02690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.02744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096512.02748: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096512.02750: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096512.02752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.02754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.02854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.02857: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.02860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.02866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.02959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.05004: stdout chunk (state=3): >>>ansible-tmp-1727096512.0128446-29462-226875536789342=/root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342 <<< 27712 1727096512.05086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.05195: stderr chunk (state=3): >>><<< 27712 1727096512.05200: stdout chunk (state=3): >>><<< 27712 1727096512.05346: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096512.0128446-29462-226875536789342=/root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096512.05390: variable 'ansible_module_compression' from source: unknown 27712 1727096512.05461: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 27712 1727096512.05492: variable 'ansible_facts' from source: unknown 27712 1727096512.05552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/AnsiballZ_stat.py 27712 1727096512.05693: Sending initial data 27712 1727096512.05696: Sent initial data (153 bytes) 27712 1727096512.06222: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.06226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096512.06228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.06231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.06233: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096512.06234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.06286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.06289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.06330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.08015: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096512.08051: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 27712 1727096512.08177: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpqq4x42k7 /root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/AnsiballZ_stat.py <<< 27712 1727096512.08180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/AnsiballZ_stat.py" <<< 27712 1727096512.08217: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpqq4x42k7" to remote "/root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/AnsiballZ_stat.py" <<< 27712 1727096512.08817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.08873: stderr chunk (state=3): >>><<< 27712 1727096512.08878: stdout chunk (state=3): >>><<< 27712 1727096512.08881: done transferring module to remote 27712 1727096512.08883: _low_level_execute_command(): starting 27712 1727096512.08887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/ /root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/AnsiballZ_stat.py && sleep 0' 27712 1727096512.09340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.09344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096512.09350: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096512.09353: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.09355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.09400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.09403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.09407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.09441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.11405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.11409: stdout chunk (state=3): >>><<< 27712 1727096512.11411: stderr chunk (state=3): >>><<< 27712 1727096512.11414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096512.11416: _low_level_execute_command(): starting 27712 1727096512.11418: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/AnsiballZ_stat.py && sleep 0' 27712 1727096512.11998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096512.12249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.12261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.12264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096512.12267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096512.12273: stderr chunk (state=3): >>>debug2: match not found <<< 27712 1727096512.12275: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.12277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096512.12279: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address <<< 27712 1727096512.12281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 27712 1727096512.12283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.12285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.12287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.12288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.12290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.12355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.28082: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 27712 1727096512.29685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096512.29690: stdout chunk (state=3): >>><<< 27712 1727096512.29692: stderr chunk (state=3): >>><<< 27712 1727096512.29694: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
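The module invocation above makes the "Stat profile file" task (get_profile_stat.yml:9) straightforward to reconstruct: stat is run against /etc/sysconfig/network-scripts/ifcfg-{{ profile }} with get_attributes, get_checksum and get_mime disabled, and since later tasks test profile_stat.stat.exists, the result is presumably registered as profile_stat. A sketch based on the module_args shown in the result:

    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat   # inferred from the profile_stat.stat.exists conditionals elsewhere in this log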
27712 1727096512.29697: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096512.29699: _low_level_execute_command(): starting 27712 1727096512.29701: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096512.0128446-29462-226875536789342/ > /dev/null 2>&1 && sleep 0' 27712 1727096512.30300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096512.30318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.30347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.30367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096512.30388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096512.30459: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.30507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.30526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.30561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.30625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.32608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.32622: stdout chunk (state=3): >>><<< 27712 1727096512.32635: stderr chunk (state=3): >>><<< 27712 1727096512.32674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096512.32687: handler run complete 27712 1727096512.32714: attempt loop complete, returning result 27712 1727096512.32721: _execute() done 27712 1727096512.32729: dumping result to json 27712 1727096512.32873: done dumping result, returning 27712 1727096512.32877: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0afff68d-5257-cbc7-8716-000000000b7a] 27712 1727096512.32879: sending task result for task 0afff68d-5257-cbc7-8716-000000000b7a 27712 1727096512.32952: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b7a 27712 1727096512.32956: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 27712 1727096512.33022: no more pending results, returning what we have 27712 1727096512.33026: results queue empty 27712 1727096512.33027: checking for any_errors_fatal 27712 1727096512.33034: done checking for any_errors_fatal 27712 1727096512.33035: checking for max_fail_percentage 27712 1727096512.33037: done checking for max_fail_percentage 27712 1727096512.33038: checking to see if all hosts have failed and the running result is not ok 27712 1727096512.33039: done checking to see if all hosts have failed 27712 1727096512.33039: getting the remaining hosts for this loop 27712 1727096512.33041: done getting the remaining hosts for this loop 27712 1727096512.33044: getting the next task for host managed_node2 27712 1727096512.33053: done getting next task for host managed_node2 27712 1727096512.33056: ^ task is: TASK: Set NM profile exist flag based on the profile files 27712 1727096512.33062: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096512.33069: getting variables 27712 1727096512.33071: in VariableManager get_vars() 27712 1727096512.33115: Calling all_inventory to load vars for managed_node2 27712 1727096512.33119: Calling groups_inventory to load vars for managed_node2 27712 1727096512.33122: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096512.33134: Calling all_plugins_play to load vars for managed_node2 27712 1727096512.33138: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096512.33141: Calling groups_plugins_play to load vars for managed_node2 27712 1727096512.34901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096512.36611: done with get_vars() 27712 1727096512.36642: done getting variables 27712 1727096512.36720: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 09:01:52 -0400 (0:00:00.400) 0:00:38.060 ****** 27712 1727096512.36755: entering _queue_task() for managed_node2/set_fact 27712 1727096512.37153: worker is 1 (out of 1 available) 27712 1727096512.37166: exiting _queue_task() for managed_node2/set_fact 27712 1727096512.37183: done queuing things up, now waiting for results queue to drain 27712 1727096512.37185: waiting for pending results... 
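The set_fact task queued here (get_profile_stat.yml:17) is evaluated just below and skipped because profile_stat.stat.exists is false. A skipped task's body is not printed in the trace, so this is only a plausible sketch of its shape, with the fact name chosen purely for illustration:

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    profile_file_exists: true    # hypothetical fact name; the trace does not show the real one
  when: profile_stat.stat.exists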
27712 1727096512.37592: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 27712 1727096512.37675: in run() - task 0afff68d-5257-cbc7-8716-000000000b7b 27712 1727096512.37694: variable 'ansible_search_path' from source: unknown 27712 1727096512.37697: variable 'ansible_search_path' from source: unknown 27712 1727096512.37708: calling self._execute() 27712 1727096512.37824: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096512.37836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096512.37907: variable 'omit' from source: magic vars 27712 1727096512.38275: variable 'ansible_distribution_major_version' from source: facts 27712 1727096512.38293: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096512.38431: variable 'profile_stat' from source: set_fact 27712 1727096512.38461: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096512.38472: when evaluation is False, skipping this task 27712 1727096512.38480: _execute() done 27712 1727096512.38489: dumping result to json 27712 1727096512.38565: done dumping result, returning 27712 1727096512.38571: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-cbc7-8716-000000000b7b] 27712 1727096512.38573: sending task result for task 0afff68d-5257-cbc7-8716-000000000b7b 27712 1727096512.38646: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b7b 27712 1727096512.38650: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096512.38718: no more pending results, returning what we have 27712 1727096512.38723: results queue empty 27712 1727096512.38724: checking for any_errors_fatal 27712 1727096512.38732: done checking for any_errors_fatal 27712 1727096512.38733: checking for max_fail_percentage 27712 1727096512.38735: done checking for max_fail_percentage 27712 1727096512.38736: checking to see if all hosts have failed and the running result is not ok 27712 1727096512.38737: done checking to see if all hosts have failed 27712 1727096512.38737: getting the remaining hosts for this loop 27712 1727096512.38739: done getting the remaining hosts for this loop 27712 1727096512.38742: getting the next task for host managed_node2 27712 1727096512.38751: done getting next task for host managed_node2 27712 1727096512.38754: ^ task is: TASK: Get NM profile info 27712 1727096512.38759: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096512.38763: getting variables 27712 1727096512.38765: in VariableManager get_vars() 27712 1727096512.38815: Calling all_inventory to load vars for managed_node2 27712 1727096512.38818: Calling groups_inventory to load vars for managed_node2 27712 1727096512.38821: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096512.38833: Calling all_plugins_play to load vars for managed_node2 27712 1727096512.38836: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096512.38839: Calling groups_plugins_play to load vars for managed_node2 27712 1727096512.40704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096512.42382: done with get_vars() 27712 1727096512.42409: done getting variables 27712 1727096512.42480: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 09:01:52 -0400 (0:00:00.057) 0:00:38.118 ****** 27712 1727096512.42513: entering _queue_task() for managed_node2/shell 27712 1727096512.43023: worker is 1 (out of 1 available) 27712 1727096512.43036: exiting _queue_task() for managed_node2/shell 27712 1727096512.43048: done queuing things up, now waiting for results queue to drain 27712 1727096512.43049: waiting for pending results... 
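Before running the "Get NM profile info" handler, the executor sets the per-task connection variables listed in the trace below (ssh connection, pipelining off, a 10-second timeout, sh as the shell with /bin/sh as the executable). In this run those are defaults rather than inventory entries; if one wanted to pin the same values explicitly, they could be expressed as host variables, for example:

# host_vars/managed_node2.yml (illustrative only; these simply match the values the executor reports)
ansible_connection: ssh
ansible_pipelining: false
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh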
27712 1727096512.43392: running TaskExecutor() for managed_node2/TASK: Get NM profile info 27712 1727096512.43397: in run() - task 0afff68d-5257-cbc7-8716-000000000b7c 27712 1727096512.43400: variable 'ansible_search_path' from source: unknown 27712 1727096512.43403: variable 'ansible_search_path' from source: unknown 27712 1727096512.43434: calling self._execute() 27712 1727096512.43548: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096512.43559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096512.43575: variable 'omit' from source: magic vars 27712 1727096512.43986: variable 'ansible_distribution_major_version' from source: facts 27712 1727096512.44003: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096512.44014: variable 'omit' from source: magic vars 27712 1727096512.44085: variable 'omit' from source: magic vars 27712 1727096512.44196: variable 'profile' from source: include params 27712 1727096512.44207: variable 'item' from source: include params 27712 1727096512.44287: variable 'item' from source: include params 27712 1727096512.44310: variable 'omit' from source: magic vars 27712 1727096512.44366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096512.44411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096512.44437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096512.44489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096512.44496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096512.44525: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096512.44534: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096512.44579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096512.44657: Set connection var ansible_connection to ssh 27712 1727096512.44674: Set connection var ansible_pipelining to False 27712 1727096512.44695: Set connection var ansible_timeout to 10 27712 1727096512.44702: Set connection var ansible_shell_type to sh 27712 1727096512.44719: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096512.44729: Set connection var ansible_shell_executable to /bin/sh 27712 1727096512.44796: variable 'ansible_shell_executable' from source: unknown 27712 1727096512.44799: variable 'ansible_connection' from source: unknown 27712 1727096512.44801: variable 'ansible_module_compression' from source: unknown 27712 1727096512.44803: variable 'ansible_shell_type' from source: unknown 27712 1727096512.44804: variable 'ansible_shell_executable' from source: unknown 27712 1727096512.44806: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096512.44808: variable 'ansible_pipelining' from source: unknown 27712 1727096512.44811: variable 'ansible_timeout' from source: unknown 27712 1727096512.44812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096512.44962: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096512.45014: variable 'omit' from source: magic vars 27712 1727096512.45017: starting attempt loop 27712 1727096512.45019: running the handler 27712 1727096512.45021: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096512.45045: _low_level_execute_command(): starting 27712 1727096512.45060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096512.45845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096512.45877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.45902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.45930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096512.46019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.46059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.46079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.46111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.46179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.47875: stdout chunk (state=3): >>>/root <<< 27712 1727096512.47979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.48017: stderr chunk (state=3): >>><<< 27712 1727096512.48019: stdout chunk (state=3): >>><<< 27712 1727096512.48054: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096512.48059: _low_level_execute_command(): starting 27712 1727096512.48062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776 `" && echo ansible-tmp-1727096512.4803467-29482-245260328965776="` echo /root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776 `" ) && sleep 0' 27712 1727096512.48517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.48521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.48523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.48525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.48563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.48587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.48620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.50554: stdout chunk (state=3): >>>ansible-tmp-1727096512.4803467-29482-245260328965776=/root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776 <<< 27712 1727096512.50774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.50778: stdout chunk (state=3): >>><<< 27712 1727096512.50781: stderr chunk (state=3): >>><<< 27712 1727096512.50783: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096512.4803467-29482-245260328965776=/root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096512.50785: variable 'ansible_module_compression' from source: unknown 27712 1727096512.50806: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096512.50844: variable 'ansible_facts' from source: unknown 27712 1727096512.50934: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/AnsiballZ_command.py 27712 1727096512.51151: Sending initial data 27712 1727096512.51162: Sent initial data (156 bytes) 27712 1727096512.51561: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.51577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096512.51594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.51638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.51650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.51694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.53307: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096512.53337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096512.53372: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmp35z_p2uj /root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/AnsiballZ_command.py <<< 27712 1727096512.53382: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/AnsiballZ_command.py" <<< 27712 1727096512.53401: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmp35z_p2uj" to remote "/root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/AnsiballZ_command.py" <<< 27712 1727096512.53404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/AnsiballZ_command.py" <<< 27712 1727096512.53889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.53945: stderr chunk (state=3): >>><<< 27712 1727096512.53948: stdout chunk (state=3): >>><<< 27712 1727096512.54043: done transferring module to remote 27712 1727096512.54046: _low_level_execute_command(): starting 27712 1727096512.54049: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/ /root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/AnsiballZ_command.py && sleep 0' 27712 1727096512.54684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.54780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.54783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.54802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.54844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.56652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.56690: stderr chunk (state=3): >>><<< 27712 1727096512.56694: stdout chunk (state=3): >>><<< 27712 1727096512.56711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096512.56714: _low_level_execute_command(): starting 27712 1727096512.56717: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/AnsiballZ_command.py && sleep 0' 27712 1727096512.57147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.57151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096512.57185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096512.57188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.57190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096512.57192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.57248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.57251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.57257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.57295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.74387: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-23 09:01:52.726569", "end": "2024-09-23 09:01:52.742842", "delta": "0:00:00.016273", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096512.76000: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096512.76028: stderr chunk (state=3): >>><<< 27712 1727096512.76031: stdout chunk (state=3): >>><<< 27712 1727096512.76050: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-23 09:01:52.726569", "end": "2024-09-23 09:01:52.742842", "delta": "0:00:00.016273", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.126 closed. 
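The raw result above shows the command the "Get NM profile info" task runs, nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc, which exits 1 because grep matches nothing: NetworkManager has no ethtest1 profile stored under /etc. A minimal sketch of that task (get_profile_stat.yml:25), assuming the profile name is templated, the register name nm_profile_exists that the next conditional uses, and ignore_errors to match the "...ignoring" in the play output below:

- name: Get NM profile info
  ansible.builtin.shell: "nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc"
  register: nm_profile_exists
  ignore_errors: true

Because errors are ignored, the rc=1 is reported as FAILED! ...ignoring and the play continues; nm_profile_exists.rc then acts as the guard for the following set_fact task.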
27712 1727096512.76086: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096512.76093: _low_level_execute_command(): starting 27712 1727096512.76098: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096512.4803467-29482-245260328965776/ > /dev/null 2>&1 && sleep 0' 27712 1727096512.76543: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.76585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096512.76595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096512.76598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.76600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096512.76602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096512.76604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096512.76646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096512.76651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096512.76657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096512.76703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096512.78564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096512.78594: stderr chunk (state=3): >>><<< 27712 1727096512.78597: stdout chunk (state=3): >>><<< 27712 1727096512.78613: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096512.78619: handler run complete 27712 1727096512.78636: Evaluated conditional (False): False 27712 1727096512.78651: attempt loop complete, returning result 27712 1727096512.78654: _execute() done 27712 1727096512.78673: dumping result to json 27712 1727096512.78676: done dumping result, returning 27712 1727096512.78679: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0afff68d-5257-cbc7-8716-000000000b7c] 27712 1727096512.78695: sending task result for task 0afff68d-5257-cbc7-8716-000000000b7c 27712 1727096512.78807: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b7c 27712 1727096512.78811: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "delta": "0:00:00.016273", "end": "2024-09-23 09:01:52.742842", "rc": 1, "start": "2024-09-23 09:01:52.726569" } MSG: non-zero return code ...ignoring 27712 1727096512.78890: no more pending results, returning what we have 27712 1727096512.78893: results queue empty 27712 1727096512.78894: checking for any_errors_fatal 27712 1727096512.78902: done checking for any_errors_fatal 27712 1727096512.78902: checking for max_fail_percentage 27712 1727096512.78904: done checking for max_fail_percentage 27712 1727096512.78905: checking to see if all hosts have failed and the running result is not ok 27712 1727096512.78906: done checking to see if all hosts have failed 27712 1727096512.78906: getting the remaining hosts for this loop 27712 1727096512.78908: done getting the remaining hosts for this loop 27712 1727096512.78911: getting the next task for host managed_node2 27712 1727096512.78917: done getting next task for host managed_node2 27712 1727096512.78919: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27712 1727096512.78924: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096512.78928: getting variables 27712 1727096512.78930: in VariableManager get_vars() 27712 1727096512.78972: Calling all_inventory to load vars for managed_node2 27712 1727096512.78975: Calling groups_inventory to load vars for managed_node2 27712 1727096512.78977: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096512.78992: Calling all_plugins_play to load vars for managed_node2 27712 1727096512.78995: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096512.78998: Calling groups_plugins_play to load vars for managed_node2 27712 1727096512.79818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096512.84775: done with get_vars() 27712 1727096512.84792: done getting variables 27712 1727096512.84837: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 09:01:52 -0400 (0:00:00.423) 0:00:38.541 ****** 27712 1727096512.84862: entering _queue_task() for managed_node2/set_fact 27712 1727096512.85134: worker is 1 (out of 1 available) 27712 1727096512.85146: exiting _queue_task() for managed_node2/set_fact 27712 1727096512.85157: done queuing things up, now waiting for results queue to drain 27712 1727096512.85159: waiting for pending results... 
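This set_fact task (get_profile_stat.yml:35) is guarded by nm_profile_exists.rc == 0, so with the grep above returning 1 it is skipped. A sketch of the shape such a task would take; the fact names are illustrative, since a skipped task's body is not printed:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    nm_profile_found: true         # illustrative fact names; not visible in the trace
    profile_ansible_managed: true
  when: nm_profile_exists.rc == 0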
27712 1727096512.85357: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 27712 1727096512.85498: in run() - task 0afff68d-5257-cbc7-8716-000000000b7d 27712 1727096512.85515: variable 'ansible_search_path' from source: unknown 27712 1727096512.85518: variable 'ansible_search_path' from source: unknown 27712 1727096512.85553: calling self._execute() 27712 1727096512.85635: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096512.85639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096512.85647: variable 'omit' from source: magic vars 27712 1727096512.85976: variable 'ansible_distribution_major_version' from source: facts 27712 1727096512.85984: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096512.86105: variable 'nm_profile_exists' from source: set_fact 27712 1727096512.86116: Evaluated conditional (nm_profile_exists.rc == 0): False 27712 1727096512.86120: when evaluation is False, skipping this task 27712 1727096512.86123: _execute() done 27712 1727096512.86126: dumping result to json 27712 1727096512.86129: done dumping result, returning 27712 1727096512.86135: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-cbc7-8716-000000000b7d] 27712 1727096512.86140: sending task result for task 0afff68d-5257-cbc7-8716-000000000b7d 27712 1727096512.86226: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b7d 27712 1727096512.86230: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 27712 1727096512.86314: no more pending results, returning what we have 27712 1727096512.86317: results queue empty 27712 1727096512.86318: checking for any_errors_fatal 27712 1727096512.86328: done checking for any_errors_fatal 27712 1727096512.86328: checking for max_fail_percentage 27712 1727096512.86330: done checking for max_fail_percentage 27712 1727096512.86330: checking to see if all hosts have failed and the running result is not ok 27712 1727096512.86331: done checking to see if all hosts have failed 27712 1727096512.86332: getting the remaining hosts for this loop 27712 1727096512.86333: done getting the remaining hosts for this loop 27712 1727096512.86336: getting the next task for host managed_node2 27712 1727096512.86347: done getting next task for host managed_node2 27712 1727096512.86349: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 27712 1727096512.86354: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096512.86357: getting variables 27712 1727096512.86359: in VariableManager get_vars() 27712 1727096512.86395: Calling all_inventory to load vars for managed_node2 27712 1727096512.86398: Calling groups_inventory to load vars for managed_node2 27712 1727096512.86400: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096512.86409: Calling all_plugins_play to load vars for managed_node2 27712 1727096512.86411: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096512.86413: Calling groups_plugins_play to load vars for managed_node2 27712 1727096512.87397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096512.88564: done with get_vars() 27712 1727096512.88582: done getting variables 27712 1727096512.88623: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096512.88730: variable 'profile' from source: include params 27712 1727096512.88734: variable 'item' from source: include params 27712 1727096512.88810: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest1] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 09:01:52 -0400 (0:00:00.039) 0:00:38.581 ****** 27712 1727096512.88840: entering _queue_task() for managed_node2/command 27712 1727096512.89113: worker is 1 (out of 1 available) 27712 1727096512.89126: exiting _queue_task() for managed_node2/command 27712 1727096512.89140: done queuing things up, now waiting for results queue to drain 27712 1727096512.89141: waiting for pending results... 
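The remaining ifcfg inspection tasks in get_profile_stat.yml, "Get the ansible_managed comment" at :49, "Verify the ansible_managed comment" at :56, and "Get the fingerprint comment" at :62, are all guarded by the same profile_stat.stat.exists condition, so each is skipped here because ifcfg-ethtest1 does not exist. A generic sketch of that guard pattern for the :49 command task; the grep expression and register name are hypothetical, since skipped command bodies are not shown:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: "grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # hypothetical command body
  register: ansible_managed_comment   # illustrative register name
  when: profile_stat.stat.exists

The verify task at :56 is a set_fact under the same guard, as the skip messages for it in the trace below confirm.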
27712 1727096512.89347: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest1 27712 1727096512.89451: in run() - task 0afff68d-5257-cbc7-8716-000000000b7f 27712 1727096512.89463: variable 'ansible_search_path' from source: unknown 27712 1727096512.89477: variable 'ansible_search_path' from source: unknown 27712 1727096512.89509: calling self._execute() 27712 1727096512.89614: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096512.89618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096512.89642: variable 'omit' from source: magic vars 27712 1727096512.89902: variable 'ansible_distribution_major_version' from source: facts 27712 1727096512.89910: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096512.89997: variable 'profile_stat' from source: set_fact 27712 1727096512.90006: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096512.90009: when evaluation is False, skipping this task 27712 1727096512.90012: _execute() done 27712 1727096512.90014: dumping result to json 27712 1727096512.90017: done dumping result, returning 27712 1727096512.90023: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest1 [0afff68d-5257-cbc7-8716-000000000b7f] 27712 1727096512.90028: sending task result for task 0afff68d-5257-cbc7-8716-000000000b7f 27712 1727096512.90111: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b7f 27712 1727096512.90113: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096512.90202: no more pending results, returning what we have 27712 1727096512.90205: results queue empty 27712 1727096512.90206: checking for any_errors_fatal 27712 1727096512.90209: done checking for any_errors_fatal 27712 1727096512.90210: checking for max_fail_percentage 27712 1727096512.90212: done checking for max_fail_percentage 27712 1727096512.90212: checking to see if all hosts have failed and the running result is not ok 27712 1727096512.90213: done checking to see if all hosts have failed 27712 1727096512.90214: getting the remaining hosts for this loop 27712 1727096512.90215: done getting the remaining hosts for this loop 27712 1727096512.90218: getting the next task for host managed_node2 27712 1727096512.90225: done getting next task for host managed_node2 27712 1727096512.90228: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 27712 1727096512.90232: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096512.90235: getting variables 27712 1727096512.90236: in VariableManager get_vars() 27712 1727096512.90272: Calling all_inventory to load vars for managed_node2 27712 1727096512.90275: Calling groups_inventory to load vars for managed_node2 27712 1727096512.90277: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096512.90286: Calling all_plugins_play to load vars for managed_node2 27712 1727096512.90288: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096512.90291: Calling groups_plugins_play to load vars for managed_node2 27712 1727096512.91203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096512.92260: done with get_vars() 27712 1727096512.92277: done getting variables 27712 1727096512.92335: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096512.92418: variable 'profile' from source: include params 27712 1727096512.92421: variable 'item' from source: include params 27712 1727096512.92458: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest1] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 09:01:52 -0400 (0:00:00.036) 0:00:38.618 ****** 27712 1727096512.92484: entering _queue_task() for managed_node2/set_fact 27712 1727096512.92764: worker is 1 (out of 1 available) 27712 1727096512.92778: exiting _queue_task() for managed_node2/set_fact 27712 1727096512.92791: done queuing things up, now waiting for results queue to drain 27712 1727096512.92792: waiting for pending results... 
27712 1727096512.92998: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 27712 1727096512.93112: in run() - task 0afff68d-5257-cbc7-8716-000000000b80 27712 1727096512.93128: variable 'ansible_search_path' from source: unknown 27712 1727096512.93132: variable 'ansible_search_path' from source: unknown 27712 1727096512.93183: calling self._execute() 27712 1727096512.93260: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096512.93265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096512.93281: variable 'omit' from source: magic vars 27712 1727096512.93616: variable 'ansible_distribution_major_version' from source: facts 27712 1727096512.93637: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096512.93721: variable 'profile_stat' from source: set_fact 27712 1727096512.93727: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096512.93730: when evaluation is False, skipping this task 27712 1727096512.93733: _execute() done 27712 1727096512.93740: dumping result to json 27712 1727096512.93743: done dumping result, returning 27712 1727096512.93746: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 [0afff68d-5257-cbc7-8716-000000000b80] 27712 1727096512.93754: sending task result for task 0afff68d-5257-cbc7-8716-000000000b80 27712 1727096512.93849: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b80 27712 1727096512.93853: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096512.93897: no more pending results, returning what we have 27712 1727096512.93900: results queue empty 27712 1727096512.93901: checking for any_errors_fatal 27712 1727096512.93913: done checking for any_errors_fatal 27712 1727096512.93914: checking for max_fail_percentage 27712 1727096512.93916: done checking for max_fail_percentage 27712 1727096512.93917: checking to see if all hosts have failed and the running result is not ok 27712 1727096512.93917: done checking to see if all hosts have failed 27712 1727096512.93918: getting the remaining hosts for this loop 27712 1727096512.93919: done getting the remaining hosts for this loop 27712 1727096512.93923: getting the next task for host managed_node2 27712 1727096512.93931: done getting next task for host managed_node2 27712 1727096512.93933: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 27712 1727096512.93938: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096512.93942: getting variables 27712 1727096512.93943: in VariableManager get_vars() 27712 1727096512.94016: Calling all_inventory to load vars for managed_node2 27712 1727096512.94019: Calling groups_inventory to load vars for managed_node2 27712 1727096512.94021: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096512.94030: Calling all_plugins_play to load vars for managed_node2 27712 1727096512.94033: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096512.94035: Calling groups_plugins_play to load vars for managed_node2 27712 1727096512.94929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096512.95989: done with get_vars() 27712 1727096512.96005: done getting variables 27712 1727096512.96044: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096512.96125: variable 'profile' from source: include params 27712 1727096512.96128: variable 'item' from source: include params 27712 1727096512.96184: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest1] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 09:01:52 -0400 (0:00:00.037) 0:00:38.655 ****** 27712 1727096512.96213: entering _queue_task() for managed_node2/command 27712 1727096512.96524: worker is 1 (out of 1 available) 27712 1727096512.96542: exiting _queue_task() for managed_node2/command 27712 1727096512.96556: done queuing things up, now waiting for results queue to drain 27712 1727096512.96558: waiting for pending results... 
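[editor's note] The skip above and the ones that follow share one shape: verification tasks in get_profile_stat.yml gated on the profile_stat fact gathered earlier, so they do nothing when ifcfg-ethtest1 does not exist. A minimal sketch of that pattern, assuming a placeholder task body (the skipped tasks never execute in this run, so their real contents do not appear in the log); only the guards and the action type are taken from the trace, and the distribution check may be inherited from an enclosing block rather than written on the task itself:

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    managed_comment_ok: true   # hypothetical fact name; the real set_fact body is not visible in this trace
  when:
    - ansible_distribution_major_version != '6'
    - profile_stat.stat.exists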
27712 1727096512.96989: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest1 27712 1727096512.97009: in run() - task 0afff68d-5257-cbc7-8716-000000000b81 27712 1727096512.97033: variable 'ansible_search_path' from source: unknown 27712 1727096512.97041: variable 'ansible_search_path' from source: unknown 27712 1727096512.97090: calling self._execute() 27712 1727096512.97278: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096512.97281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096512.97284: variable 'omit' from source: magic vars 27712 1727096512.97662: variable 'ansible_distribution_major_version' from source: facts 27712 1727096512.97666: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096512.97743: variable 'profile_stat' from source: set_fact 27712 1727096512.97751: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096512.97754: when evaluation is False, skipping this task 27712 1727096512.97763: _execute() done 27712 1727096512.97766: dumping result to json 27712 1727096512.97771: done dumping result, returning 27712 1727096512.97779: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest1 [0afff68d-5257-cbc7-8716-000000000b81] 27712 1727096512.97787: sending task result for task 0afff68d-5257-cbc7-8716-000000000b81 27712 1727096512.97879: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b81 27712 1727096512.97882: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096512.97936: no more pending results, returning what we have 27712 1727096512.97940: results queue empty 27712 1727096512.97941: checking for any_errors_fatal 27712 1727096512.97948: done checking for any_errors_fatal 27712 1727096512.97949: checking for max_fail_percentage 27712 1727096512.97950: done checking for max_fail_percentage 27712 1727096512.97951: checking to see if all hosts have failed and the running result is not ok 27712 1727096512.97952: done checking to see if all hosts have failed 27712 1727096512.97953: getting the remaining hosts for this loop 27712 1727096512.97954: done getting the remaining hosts for this loop 27712 1727096512.97957: getting the next task for host managed_node2 27712 1727096512.97963: done getting next task for host managed_node2 27712 1727096512.97966: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 27712 1727096512.97977: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096512.97982: getting variables 27712 1727096512.97983: in VariableManager get_vars() 27712 1727096512.98023: Calling all_inventory to load vars for managed_node2 27712 1727096512.98025: Calling groups_inventory to load vars for managed_node2 27712 1727096512.98027: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096512.98036: Calling all_plugins_play to load vars for managed_node2 27712 1727096512.98038: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096512.98041: Calling groups_plugins_play to load vars for managed_node2 27712 1727096512.98999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096512.99937: done with get_vars() 27712 1727096512.99952: done getting variables 27712 1727096513.00015: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096513.00114: variable 'profile' from source: include params 27712 1727096513.00118: variable 'item' from source: include params 27712 1727096513.00177: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest1] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 09:01:53 -0400 (0:00:00.039) 0:00:38.695 ****** 27712 1727096513.00203: entering _queue_task() for managed_node2/set_fact 27712 1727096513.00434: worker is 1 (out of 1 available) 27712 1727096513.00451: exiting _queue_task() for managed_node2/set_fact 27712 1727096513.00465: done queuing things up, now waiting for results queue to drain 27712 1727096513.00466: waiting for pending results... 
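[editor's note] The repeated "variable 'profile' from source: include params" / "variable 'item' from source: include params" entries show how the profile name reaches these tasks: get_profile_stat.yml is included with the profile passed in as a parameter. An illustrative include that would produce exactly those variable sources; the wrapper task name and loop layout are assumptions, while the file name and the ethtest1 value come from the trace:

- name: Gather profile stat for each test interface   # illustrative wrapper; the actual including task is not shown here
  include_tasks: get_profile_stat.yml
  vars:
    profile: "{{ item }}"
  loop:
    - ethtest1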
27712 1727096513.00886: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest1 27712 1727096513.00892: in run() - task 0afff68d-5257-cbc7-8716-000000000b82 27712 1727096513.00896: variable 'ansible_search_path' from source: unknown 27712 1727096513.00900: variable 'ansible_search_path' from source: unknown 27712 1727096513.00904: calling self._execute() 27712 1727096513.01015: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.01020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.01024: variable 'omit' from source: magic vars 27712 1727096513.01600: variable 'ansible_distribution_major_version' from source: facts 27712 1727096513.01604: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096513.01618: variable 'profile_stat' from source: set_fact 27712 1727096513.01622: Evaluated conditional (profile_stat.stat.exists): False 27712 1727096513.01625: when evaluation is False, skipping this task 27712 1727096513.01628: _execute() done 27712 1727096513.01630: dumping result to json 27712 1727096513.01633: done dumping result, returning 27712 1727096513.01642: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest1 [0afff68d-5257-cbc7-8716-000000000b82] 27712 1727096513.01772: sending task result for task 0afff68d-5257-cbc7-8716-000000000b82 27712 1727096513.01841: done sending task result for task 0afff68d-5257-cbc7-8716-000000000b82 27712 1727096513.01845: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 27712 1727096513.01898: no more pending results, returning what we have 27712 1727096513.01901: results queue empty 27712 1727096513.01902: checking for any_errors_fatal 27712 1727096513.01911: done checking for any_errors_fatal 27712 1727096513.01911: checking for max_fail_percentage 27712 1727096513.01913: done checking for max_fail_percentage 27712 1727096513.01914: checking to see if all hosts have failed and the running result is not ok 27712 1727096513.01914: done checking to see if all hosts have failed 27712 1727096513.01915: getting the remaining hosts for this loop 27712 1727096513.01916: done getting the remaining hosts for this loop 27712 1727096513.01919: getting the next task for host managed_node2 27712 1727096513.01927: done getting next task for host managed_node2 27712 1727096513.01929: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 27712 1727096513.01934: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096513.01938: getting variables 27712 1727096513.01940: in VariableManager get_vars() 27712 1727096513.01985: Calling all_inventory to load vars for managed_node2 27712 1727096513.01989: Calling groups_inventory to load vars for managed_node2 27712 1727096513.01991: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096513.02003: Calling all_plugins_play to load vars for managed_node2 27712 1727096513.02006: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096513.02009: Calling groups_plugins_play to load vars for managed_node2 27712 1727096513.03667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096513.05330: done with get_vars() 27712 1727096513.05359: done getting variables 27712 1727096513.05422: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 27712 1727096513.05566: variable 'profile' from source: include params 27712 1727096513.05572: variable 'item' from source: include params 27712 1727096513.05630: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Monday 23 September 2024 09:01:53 -0400 (0:00:00.054) 0:00:38.749 ****** 27712 1727096513.05662: entering _queue_task() for managed_node2/assert 27712 1727096513.06037: worker is 1 (out of 1 available) 27712 1727096513.06051: exiting _queue_task() for managed_node2/assert 27712 1727096513.06063: done queuing things up, now waiting for results queue to drain 27712 1727096513.06064: waiting for pending results... 
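[editor's note] The assert queued here is evaluated a few lines below: it passes because lsr_net_profile_exists, set earlier in the test, is false for ethtest1. A minimal sketch of such an assert, with the condition copied from the trace and everything else kept to the bare structure:

- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists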
27712 1727096513.06312: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest1' 27712 1727096513.06423: in run() - task 0afff68d-5257-cbc7-8716-000000000a72 27712 1727096513.06443: variable 'ansible_search_path' from source: unknown 27712 1727096513.06447: variable 'ansible_search_path' from source: unknown 27712 1727096513.06488: calling self._execute() 27712 1727096513.06598: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.06605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.06614: variable 'omit' from source: magic vars 27712 1727096513.07126: variable 'ansible_distribution_major_version' from source: facts 27712 1727096513.07171: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096513.07175: variable 'omit' from source: magic vars 27712 1727096513.07195: variable 'omit' from source: magic vars 27712 1727096513.07310: variable 'profile' from source: include params 27712 1727096513.07323: variable 'item' from source: include params 27712 1727096513.07394: variable 'item' from source: include params 27712 1727096513.07461: variable 'omit' from source: magic vars 27712 1727096513.07479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096513.07519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096513.07550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096513.07582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096513.07599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096513.07696: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096513.07701: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.07704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.07802: Set connection var ansible_connection to ssh 27712 1727096513.08129: Set connection var ansible_pipelining to False 27712 1727096513.08132: Set connection var ansible_timeout to 10 27712 1727096513.08135: Set connection var ansible_shell_type to sh 27712 1727096513.08138: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096513.08140: Set connection var ansible_shell_executable to /bin/sh 27712 1727096513.08143: variable 'ansible_shell_executable' from source: unknown 27712 1727096513.08145: variable 'ansible_connection' from source: unknown 27712 1727096513.08147: variable 'ansible_module_compression' from source: unknown 27712 1727096513.08149: variable 'ansible_shell_type' from source: unknown 27712 1727096513.08151: variable 'ansible_shell_executable' from source: unknown 27712 1727096513.08153: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.08154: variable 'ansible_pipelining' from source: unknown 27712 1727096513.08157: variable 'ansible_timeout' from source: unknown 27712 1727096513.08159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.08302: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096513.08373: variable 'omit' from source: magic vars 27712 1727096513.08383: starting attempt loop 27712 1727096513.08388: running the handler 27712 1727096513.08470: variable 'lsr_net_profile_exists' from source: set_fact 27712 1727096513.08484: Evaluated conditional (not lsr_net_profile_exists): True 27712 1727096513.08573: handler run complete 27712 1727096513.08577: attempt loop complete, returning result 27712 1727096513.08579: _execute() done 27712 1727096513.08581: dumping result to json 27712 1727096513.08584: done dumping result, returning 27712 1727096513.08587: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest1' [0afff68d-5257-cbc7-8716-000000000a72] 27712 1727096513.08589: sending task result for task 0afff68d-5257-cbc7-8716-000000000a72 27712 1727096513.08663: done sending task result for task 0afff68d-5257-cbc7-8716-000000000a72 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 27712 1727096513.08720: no more pending results, returning what we have 27712 1727096513.08724: results queue empty 27712 1727096513.08726: checking for any_errors_fatal 27712 1727096513.08733: done checking for any_errors_fatal 27712 1727096513.08733: checking for max_fail_percentage 27712 1727096513.08735: done checking for max_fail_percentage 27712 1727096513.08736: checking to see if all hosts have failed and the running result is not ok 27712 1727096513.08737: done checking to see if all hosts have failed 27712 1727096513.08738: getting the remaining hosts for this loop 27712 1727096513.08740: done getting the remaining hosts for this loop 27712 1727096513.08743: getting the next task for host managed_node2 27712 1727096513.08752: done getting next task for host managed_node2 27712 1727096513.08756: ^ task is: TASK: Verify network state restored to default 27712 1727096513.08760: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096513.08766: getting variables 27712 1727096513.08771: in VariableManager get_vars() 27712 1727096513.08815: Calling all_inventory to load vars for managed_node2 27712 1727096513.08818: Calling groups_inventory to load vars for managed_node2 27712 1727096513.08821: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096513.08834: Calling all_plugins_play to load vars for managed_node2 27712 1727096513.08837: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096513.08840: Calling groups_plugins_play to load vars for managed_node2 27712 1727096513.09750: WORKER PROCESS EXITING 27712 1727096513.11523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096513.12704: done with get_vars() 27712 1727096513.12720: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:169 Monday 23 September 2024 09:01:53 -0400 (0:00:00.071) 0:00:38.821 ****** 27712 1727096513.12790: entering _queue_task() for managed_node2/include_tasks 27712 1727096513.13010: worker is 1 (out of 1 available) 27712 1727096513.13023: exiting _queue_task() for managed_node2/include_tasks 27712 1727096513.13034: done queuing things up, now waiting for results queue to drain 27712 1727096513.13035: waiting for pending results... 27712 1727096513.13242: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 27712 1727096513.13403: in run() - task 0afff68d-5257-cbc7-8716-0000000000bb 27712 1727096513.13423: variable 'ansible_search_path' from source: unknown 27712 1727096513.13475: calling self._execute() 27712 1727096513.13587: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.13598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.13611: variable 'omit' from source: magic vars 27712 1727096513.14016: variable 'ansible_distribution_major_version' from source: facts 27712 1727096513.14035: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096513.14047: _execute() done 27712 1727096513.14056: dumping result to json 27712 1727096513.14064: done dumping result, returning 27712 1727096513.14076: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0afff68d-5257-cbc7-8716-0000000000bb] 27712 1727096513.14100: sending task result for task 0afff68d-5257-cbc7-8716-0000000000bb 27712 1727096513.14229: no more pending results, returning what we have 27712 1727096513.14234: in VariableManager get_vars() 27712 1727096513.14286: Calling all_inventory to load vars for managed_node2 27712 1727096513.14290: Calling groups_inventory to load vars for managed_node2 27712 1727096513.14293: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096513.14307: Calling all_plugins_play to load vars for managed_node2 27712 1727096513.14310: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096513.14314: Calling groups_plugins_play to load vars for managed_node2 27712 1727096513.15082: done sending task result for task 0afff68d-5257-cbc7-8716-0000000000bb 27712 1727096513.15086: WORKER PROCESS EXITING 27712 1727096513.16023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 
1727096513.17912: done with get_vars() 27712 1727096513.17940: variable 'ansible_search_path' from source: unknown 27712 1727096513.17975: we have included files to process 27712 1727096513.17977: generating all_blocks data 27712 1727096513.17979: done generating all_blocks data 27712 1727096513.17984: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27712 1727096513.17985: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27712 1727096513.17987: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 27712 1727096513.18712: done processing included file 27712 1727096513.18715: iterating over new_blocks loaded from include file 27712 1727096513.18716: in VariableManager get_vars() 27712 1727096513.18734: done with get_vars() 27712 1727096513.18735: filtering new block on tags 27712 1727096513.18933: done filtering new block on tags 27712 1727096513.18937: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 27712 1727096513.18942: extending task lists for all hosts with included blocks 27712 1727096513.21155: done extending task lists 27712 1727096513.21156: done processing included files 27712 1727096513.21157: results queue empty 27712 1727096513.21158: checking for any_errors_fatal 27712 1727096513.21160: done checking for any_errors_fatal 27712 1727096513.21161: checking for max_fail_percentage 27712 1727096513.21162: done checking for max_fail_percentage 27712 1727096513.21162: checking to see if all hosts have failed and the running result is not ok 27712 1727096513.21163: done checking to see if all hosts have failed 27712 1727096513.21164: getting the remaining hosts for this loop 27712 1727096513.21165: done getting the remaining hosts for this loop 27712 1727096513.21169: getting the next task for host managed_node2 27712 1727096513.21172: done getting next task for host managed_node2 27712 1727096513.21180: ^ task is: TASK: Check routes and DNS 27712 1727096513.21197: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 27712 1727096513.21200: getting variables 27712 1727096513.21201: in VariableManager get_vars() 27712 1727096513.21219: Calling all_inventory to load vars for managed_node2 27712 1727096513.21221: Calling groups_inventory to load vars for managed_node2 27712 1727096513.21224: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096513.21229: Calling all_plugins_play to load vars for managed_node2 27712 1727096513.21231: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096513.21234: Calling groups_plugins_play to load vars for managed_node2 27712 1727096513.22428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096513.24001: done with get_vars() 27712 1727096513.24055: done getting variables 27712 1727096513.24097: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 09:01:53 -0400 (0:00:00.113) 0:00:38.934 ****** 27712 1727096513.24124: entering _queue_task() for managed_node2/shell 27712 1727096513.24499: worker is 1 (out of 1 available) 27712 1727096513.24511: exiting _queue_task() for managed_node2/shell 27712 1727096513.24533: done queuing things up, now waiting for results queue to drain 27712 1727096513.24535: waiting for pending results... 27712 1727096513.24784: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 27712 1727096513.24894: in run() - task 0afff68d-5257-cbc7-8716-000000000bb6 27712 1727096513.24907: variable 'ansible_search_path' from source: unknown 27712 1727096513.24911: variable 'ansible_search_path' from source: unknown 27712 1727096513.24955: calling self._execute() 27712 1727096513.25038: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.25042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.25058: variable 'omit' from source: magic vars 27712 1727096513.25333: variable 'ansible_distribution_major_version' from source: facts 27712 1727096513.25343: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096513.25348: variable 'omit' from source: magic vars 27712 1727096513.25387: variable 'omit' from source: magic vars 27712 1727096513.25420: variable 'omit' from source: magic vars 27712 1727096513.25450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096513.25482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096513.25497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096513.25510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096513.25520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096513.25542: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
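[editor's note] In the lines that follow, the executor resolves the connection settings for this task (ansible_connection=ssh, ansible_pipelining=False, ansible_timeout=10, ansible_shell_executable=/bin/sh), with ansible_host and ansible_ssh_extra_args coming from host vars. An illustrative inventory entry consistent with that resolution; the ssh-args value is a placeholder, since only the variable name is visible in the log, while the address matches the host the SSH debug output connects to:

all:
  hosts:
    managed_node2:
      ansible_host: 10.31.15.126
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"   # placeholder value; only the variable name appears in the trace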
27712 1727096513.25546: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.25549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.25621: Set connection var ansible_connection to ssh 27712 1727096513.25628: Set connection var ansible_pipelining to False 27712 1727096513.25633: Set connection var ansible_timeout to 10 27712 1727096513.25636: Set connection var ansible_shell_type to sh 27712 1727096513.25642: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096513.25647: Set connection var ansible_shell_executable to /bin/sh 27712 1727096513.25663: variable 'ansible_shell_executable' from source: unknown 27712 1727096513.25666: variable 'ansible_connection' from source: unknown 27712 1727096513.25671: variable 'ansible_module_compression' from source: unknown 27712 1727096513.25676: variable 'ansible_shell_type' from source: unknown 27712 1727096513.25678: variable 'ansible_shell_executable' from source: unknown 27712 1727096513.25681: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.25683: variable 'ansible_pipelining' from source: unknown 27712 1727096513.25686: variable 'ansible_timeout' from source: unknown 27712 1727096513.25690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.25787: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096513.25798: variable 'omit' from source: magic vars 27712 1727096513.25802: starting attempt loop 27712 1727096513.25805: running the handler 27712 1727096513.25819: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096513.25829: _low_level_execute_command(): starting 27712 1727096513.25836: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096513.26492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096513.26498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.26521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 27712 1727096513.26597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.28269: stdout chunk (state=3): >>>/root <<< 27712 1727096513.28374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.28405: stderr chunk (state=3): >>><<< 27712 1727096513.28407: stdout chunk (state=3): >>><<< 27712 1727096513.28420: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096513.28449: _low_level_execute_command(): starting 27712 1727096513.28453: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820 `" && echo ansible-tmp-1727096513.2842479-29519-246225293481820="` echo /root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820 `" ) && sleep 0' 27712 1727096513.29113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096513.29236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096513.29240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.29242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.29346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.31195: stdout chunk (state=3): 
>>>ansible-tmp-1727096513.2842479-29519-246225293481820=/root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820 <<< 27712 1727096513.31300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.31346: stderr chunk (state=3): >>><<< 27712 1727096513.31352: stdout chunk (state=3): >>><<< 27712 1727096513.31370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096513.2842479-29519-246225293481820=/root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096513.31425: variable 'ansible_module_compression' from source: unknown 27712 1727096513.31498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096513.31546: variable 'ansible_facts' from source: unknown 27712 1727096513.31691: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/AnsiballZ_command.py 27712 1727096513.31854: Sending initial data 27712 1727096513.31857: Sent initial data (156 bytes) 27712 1727096513.33014: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096513.33071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096513.33093: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096513.33161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096513.33198: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.33238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.34895: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096513.34917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/AnsiballZ_command.py" <<< 27712 1727096513.34989: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpubpqtr04 /root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/AnsiballZ_command.py <<< 27712 1727096513.34992: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpubpqtr04" to remote "/root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/AnsiballZ_command.py" <<< 27712 1727096513.35755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.35858: stderr chunk (state=3): >>><<< 27712 1727096513.35861: stdout chunk (state=3): >>><<< 27712 1727096513.35863: done transferring module to remote 27712 1727096513.35865: _low_level_execute_command(): starting 27712 1727096513.35868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/ /root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/AnsiballZ_command.py && sleep 0' 27712 1727096513.36434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096513.36456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096513.36487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096513.36523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096513.36608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.36836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.36839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.38622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.38649: stderr chunk (state=3): >>><<< 27712 1727096513.38653: stdout chunk (state=3): >>><<< 27712 1727096513.38668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096513.38672: _low_level_execute_command(): starting 27712 1727096513.38678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/AnsiballZ_command.py && sleep 0' 27712 1727096513.39098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096513.39101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096513.39104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096513.39106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096513.39145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.39149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 
1727096513.39203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.55436: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3094sec preferred_lft 3094sec\n inet6 fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 09:01:53.544371", "end": "2024-09-23 09:01:53.553040", "delta": "0:00:00.008669", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096513.57278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
<<< 27712 1727096513.57282: stdout chunk (state=3): >>><<< 27712 1727096513.57289: stderr chunk (state=3): >>><<< 27712 1727096513.57293: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3094sec preferred_lft 3094sec\n inet6 fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 09:01:53.544371", "end": "2024-09-23 09:01:53.553040", "delta": "0:00:00.008669", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 
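[editor's note] For readability, the _raw_params echoed back in the module invocation above correspond to a shell task at check_network_dns.yml:6 roughly like the sketch below. The script body is copied from the logged command; the register name is an assumption, and changed_when: false is inferred from the final task result reporting "changed": false even though the raw module output says "changed": true:

- name: Check routes and DNS
  shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
     cat /etc/resolv.conf
    else
     echo NO /etc/resolv.conf
     ls -alrtF /etc/resolv.* || :
    fi
  register: route_dns_check   # assumed register name; not visible in the log
  changed_when: false         # inferred from the reported task result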
27712 1727096513.57301: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096513.57304: _low_level_execute_command(): starting 27712 1727096513.57306: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096513.2842479-29519-246225293481820/ > /dev/null 2>&1 && sleep 0' 27712 1727096513.57760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096513.57769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096513.57783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096513.57801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 27712 1727096513.57846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 <<< 27712 1727096513.57892: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096513.58099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.58277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.58398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.60188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.60192: stdout chunk (state=3): >>><<< 27712 1727096513.60197: stderr chunk (state=3): >>><<< 27712 1727096513.60214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096513.60221: handler run complete 27712 1727096513.60244: Evaluated conditional (False): False 27712 1727096513.60255: attempt loop complete, returning result 27712 1727096513.60258: _execute() done 27712 1727096513.60260: dumping result to json 27712 1727096513.60267: done dumping result, returning 27712 1727096513.60388: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0afff68d-5257-cbc7-8716-000000000bb6] 27712 1727096513.60391: sending task result for task 0afff68d-5257-cbc7-8716-000000000bb6 27712 1727096513.60514: done sending task result for task 0afff68d-5257-cbc7-8716-000000000bb6 27712 1727096513.60518: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008669", "end": "2024-09-23 09:01:53.553040", "rc": 0, "start": "2024-09-23 09:01:53.544371" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:ce:61:4d:8f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.126/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3094sec preferred_lft 3094sec inet6 fe80::8ff:ceff:fe61:4d8f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.126 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.126 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 27712 1727096513.60787: no more pending results, returning what we have 27712 1727096513.60791: results queue empty 27712 1727096513.60792: checking for any_errors_fatal 27712 1727096513.60794: done checking for any_errors_fatal 27712 1727096513.60795: checking for max_fail_percentage 27712 1727096513.60797: done checking for max_fail_percentage 27712 1727096513.60798: checking to see if all hosts have failed and the running result is not ok 27712 1727096513.60799: done checking to see if all hosts have failed 27712 1727096513.60799: getting the remaining hosts for this loop 27712 1727096513.60801: done getting the remaining hosts for this loop 27712 1727096513.60805: 
getting the next task for host managed_node2 27712 1727096513.60812: done getting next task for host managed_node2 27712 1727096513.60814: ^ task is: TASK: Verify DNS and network connectivity 27712 1727096513.60819: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 27712 1727096513.60829: getting variables 27712 1727096513.60831: in VariableManager get_vars() 27712 1727096513.61074: Calling all_inventory to load vars for managed_node2 27712 1727096513.61078: Calling groups_inventory to load vars for managed_node2 27712 1727096513.61080: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096513.61092: Calling all_plugins_play to load vars for managed_node2 27712 1727096513.61095: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096513.61098: Calling groups_plugins_play to load vars for managed_node2 27712 1727096513.64226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096513.66332: done with get_vars() 27712 1727096513.66366: done getting variables 27712 1727096513.66429: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 09:01:53 -0400 (0:00:00.423) 0:00:39.357 ****** 27712 1727096513.66462: entering _queue_task() for managed_node2/shell 27712 1727096513.66850: worker is 1 (out of 1 available) 27712 1727096513.66864: exiting _queue_task() for managed_node2/shell 27712 1727096513.66882: done queuing things up, now waiting for results queue to drain 27712 1727096513.66884: waiting for pending results... 
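For context, the "Verify DNS and network connectivity" step traced below is a shell task defined at tasks/check_network_dns.yml:24. The following is a minimal sketch of such a task, reconstructed from the cmd string reported in the task result further down; the exact YAML layout, the when clause, and any other options are assumptions, not text copied from the test repository:

    # Sketch only: reconstructed from the cmd shown in the result below.
    - name: Verify DNS and network connectivity
      shell: |
        set -euo pipefail
        echo CHECK DNS AND CONNECTIVITY
        for host in mirrors.fedoraproject.org mirrors.centos.org; do
          if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
          fi
          if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
          fi
        done
      # The trace evaluates two conditionals before running the handler; a when
      # clause along these lines would reproduce them (assumed, not confirmed).
      when:
        - ansible_distribution_major_version != '6'
        - ansible_facts["distribution"] == "CentOS"
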
27712 1727096513.67204: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 27712 1727096513.67325: in run() - task 0afff68d-5257-cbc7-8716-000000000bb7 27712 1727096513.67346: variable 'ansible_search_path' from source: unknown 27712 1727096513.67401: variable 'ansible_search_path' from source: unknown 27712 1727096513.67406: calling self._execute() 27712 1727096513.67516: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.67527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.67543: variable 'omit' from source: magic vars 27712 1727096513.68377: variable 'ansible_distribution_major_version' from source: facts 27712 1727096513.68381: Evaluated conditional (ansible_distribution_major_version != '6'): True 27712 1727096513.68551: variable 'ansible_facts' from source: unknown 27712 1727096513.69599: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 27712 1727096513.69611: variable 'omit' from source: magic vars 27712 1727096513.69653: variable 'omit' from source: magic vars 27712 1727096513.69693: variable 'omit' from source: magic vars 27712 1727096513.69744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 27712 1727096513.69790: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 27712 1727096513.69819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 27712 1727096513.69843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096513.69861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 27712 1727096513.69900: variable 'inventory_hostname' from source: host vars for 'managed_node2' 27712 1727096513.69926: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.69929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.70023: Set connection var ansible_connection to ssh 27712 1727096513.70143: Set connection var ansible_pipelining to False 27712 1727096513.70145: Set connection var ansible_timeout to 10 27712 1727096513.70147: Set connection var ansible_shell_type to sh 27712 1727096513.70149: Set connection var ansible_module_compression to ZIP_DEFLATED 27712 1727096513.70150: Set connection var ansible_shell_executable to /bin/sh 27712 1727096513.70153: variable 'ansible_shell_executable' from source: unknown 27712 1727096513.70154: variable 'ansible_connection' from source: unknown 27712 1727096513.70156: variable 'ansible_module_compression' from source: unknown 27712 1727096513.70158: variable 'ansible_shell_type' from source: unknown 27712 1727096513.70159: variable 'ansible_shell_executable' from source: unknown 27712 1727096513.70161: variable 'ansible_host' from source: host vars for 'managed_node2' 27712 1727096513.70162: variable 'ansible_pipelining' from source: unknown 27712 1727096513.70164: variable 'ansible_timeout' from source: unknown 27712 1727096513.70166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 27712 1727096513.70262: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096513.70281: variable 'omit' from source: magic vars 27712 1727096513.70289: starting attempt loop 27712 1727096513.70295: running the handler 27712 1727096513.70308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 27712 1727096513.70328: _low_level_execute_command(): starting 27712 1727096513.70338: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 27712 1727096513.71024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096513.71039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096513.71056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096513.71131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096513.71181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' <<< 27712 1727096513.71200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.71228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.71301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.72964: stdout chunk (state=3): >>>/root <<< 27712 1727096513.73082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.73091: stderr chunk (state=3): >>><<< 27712 1727096513.73095: stdout chunk (state=3): >>><<< 27712 1727096513.73125: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 
10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096513.73136: _low_level_execute_command(): starting 27712 1727096513.73142: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635 `" && echo ansible-tmp-1727096513.7312393-29540-170494980145635="` echo /root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635 `" ) && sleep 0' 27712 1727096513.73540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096513.73570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 27712 1727096513.73577: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found <<< 27712 1727096513.73580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096513.73630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.73638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.73673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.75978: stdout chunk (state=3): >>>ansible-tmp-1727096513.7312393-29540-170494980145635=/root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635 <<< 27712 1727096513.75982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.75985: stdout chunk (state=3): >>><<< 27712 1727096513.75987: stderr chunk (state=3): >>><<< 27712 1727096513.75989: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096513.7312393-29540-170494980145635=/root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096513.75991: variable 'ansible_module_compression' from source: unknown 27712 1727096513.75993: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-277127eokbzf0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 27712 1727096513.75995: variable 'ansible_facts' from source: unknown 27712 1727096513.76058: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/AnsiballZ_command.py 27712 1727096513.76213: Sending initial data 27712 1727096513.76216: Sent initial data (156 bytes) 27712 1727096513.76778: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096513.76784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096513.76856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.76880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.76902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.78451: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 27712 1727096513.78491: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 27712 1727096513.78554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-277127eokbzf0/tmpzgz6rvn8 /root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/AnsiballZ_command.py <<< 27712 1727096513.78557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/AnsiballZ_command.py" <<< 27712 1727096513.78609: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-277127eokbzf0/tmpzgz6rvn8" to remote "/root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/AnsiballZ_command.py" <<< 27712 1727096513.79322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.79386: stderr chunk (state=3): >>><<< 27712 1727096513.79389: stdout chunk (state=3): >>><<< 27712 1727096513.79391: done transferring module to remote 27712 1727096513.79393: _low_level_execute_command(): starting 27712 1727096513.79396: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/ /root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/AnsiballZ_command.py && sleep 0' 27712 1727096513.79844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096513.79860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.79900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096513.81676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096513.81738: stderr chunk (state=3): >>><<< 27712 1727096513.81742: stdout chunk (state=3): >>><<< 27712 1727096513.81774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096513.81778: _low_level_execute_command(): starting 27712 1727096513.81780: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/AnsiballZ_command.py && sleep 0' 27712 1727096513.82190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096513.82193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found <<< 27712 1727096513.82196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration <<< 27712 1727096513.82198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 27712 1727096513.82200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096513.82310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096513.82313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096514.28609: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1473 0 --:--:-- --:--:-- --:--:-- 1480\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3717 0 --:--:-- --:--:-- --:--:-- 3730", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 09:01:53.974756", "end": "2024-09-23 09:01:54.284539", "delta": "0:00:00.309783", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 27712 1727096514.30380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. <<< 27712 1727096514.30384: stdout chunk (state=3): >>><<< 27712 1727096514.30386: stderr chunk (state=3): >>><<< 27712 1727096514.30389: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1473 0 --:--:-- --:--:-- --:--:-- 1480\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- 
--:--:-- 0\r100 291 100 291 0 0 3717 0 --:--:-- --:--:-- --:--:-- 3730", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 09:01:53.974756", "end": "2024-09-23 09:01:54.284539", "delta": "0:00:00.309783", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.126 closed. 27712 1727096514.30400: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 27712 1727096514.30403: _low_level_execute_command(): starting 27712 1727096514.30405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096513.7312393-29540-170494980145635/ > /dev/null 2>&1 && sleep 0' 27712 1727096514.31032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 27712 1727096514.31040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 27712 1727096514.31086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 27712 1727096514.31092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 27712 1727096514.31181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK <<< 27712 1727096514.31204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 27712 1727096514.31264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 27712 1727096514.33160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 27712 1727096514.33163: stdout chunk (state=3): >>><<< 27712 1727096514.33166: stderr chunk (state=3): >>><<< 27712 1727096514.33191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.126 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.126 originally 10.31.15.126 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/35282dee7b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 27712 1727096514.33203: handler run complete 27712 1727096514.33375: Evaluated conditional (False): False 27712 1727096514.33378: attempt loop complete, returning result 27712 1727096514.33381: _execute() done 27712 1727096514.33383: dumping result to json 27712 1727096514.33385: done dumping result, returning 27712 1727096514.33388: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0afff68d-5257-cbc7-8716-000000000bb7] 27712 1727096514.33390: sending task result for task 0afff68d-5257-cbc7-8716-000000000bb7 27712 1727096514.33466: done sending task result for task 0afff68d-5257-cbc7-8716-000000000bb7 ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.309783", "end": "2024-09-23 09:01:54.284539", "rc": 0, "start": "2024-09-23 09:01:53.974756" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1473 0 --:--:-- --:--:-- --:--:-- 1480 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 3717 0 --:--:-- --:--:-- --:--:-- 3730 27712 1727096514.33548: no more pending results, returning what we have 27712 1727096514.33552: results queue empty 27712 1727096514.33553: checking for any_errors_fatal 27712 1727096514.33565: done checking for any_errors_fatal 27712 1727096514.33566: checking for max_fail_percentage 27712 1727096514.33575: 
done checking for max_fail_percentage 27712 1727096514.33577: checking to see if all hosts have failed and the running result is not ok 27712 1727096514.33578: done checking to see if all hosts have failed 27712 1727096514.33578: getting the remaining hosts for this loop 27712 1727096514.33589: done getting the remaining hosts for this loop 27712 1727096514.33593: getting the next task for host managed_node2 27712 1727096514.33604: done getting next task for host managed_node2 27712 1727096514.33611: WORKER PROCESS EXITING 27712 1727096514.33617: ^ task is: TASK: meta (flush_handlers) 27712 1727096514.33622: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096514.33628: getting variables 27712 1727096514.33629: in VariableManager get_vars() 27712 1727096514.33780: Calling all_inventory to load vars for managed_node2 27712 1727096514.33783: Calling groups_inventory to load vars for managed_node2 27712 1727096514.33787: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096514.33806: Calling all_plugins_play to load vars for managed_node2 27712 1727096514.33810: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096514.33814: Calling groups_plugins_play to load vars for managed_node2 27712 1727096514.35483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096514.37176: done with get_vars() 27712 1727096514.37200: done getting variables 27712 1727096514.37282: in VariableManager get_vars() 27712 1727096514.37297: Calling all_inventory to load vars for managed_node2 27712 1727096514.37299: Calling groups_inventory to load vars for managed_node2 27712 1727096514.37301: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096514.37306: Calling all_plugins_play to load vars for managed_node2 27712 1727096514.37308: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096514.37311: Calling groups_plugins_play to load vars for managed_node2 27712 1727096514.38815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096514.40358: done with get_vars() 27712 1727096514.40385: done queuing things up, now waiting for results queue to drain 27712 1727096514.40387: results queue empty 27712 1727096514.40388: checking for any_errors_fatal 27712 1727096514.40391: done checking for any_errors_fatal 27712 1727096514.40392: checking for max_fail_percentage 27712 1727096514.40393: done checking for max_fail_percentage 27712 1727096514.40394: checking to see if all hosts have failed and the running result is not ok 27712 1727096514.40395: done checking to see if all hosts have failed 27712 1727096514.40395: getting the remaining hosts for this loop 27712 1727096514.40396: done getting the remaining hosts for this loop 27712 1727096514.40399: getting the next task for host managed_node2 27712 1727096514.40403: done getting next task for host managed_node2 27712 1727096514.40405: ^ task is: TASK: meta (flush_handlers) 27712 1727096514.40406: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 27712 1727096514.40409: getting variables 27712 1727096514.40410: in VariableManager get_vars() 27712 1727096514.40422: Calling all_inventory to load vars for managed_node2 27712 1727096514.40424: Calling groups_inventory to load vars for managed_node2 27712 1727096514.40426: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096514.40431: Calling all_plugins_play to load vars for managed_node2 27712 1727096514.40433: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096514.40436: Calling groups_plugins_play to load vars for managed_node2 27712 1727096514.41503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096514.43025: done with get_vars() 27712 1727096514.43046: done getting variables 27712 1727096514.43101: in VariableManager get_vars() 27712 1727096514.43115: Calling all_inventory to load vars for managed_node2 27712 1727096514.43118: Calling groups_inventory to load vars for managed_node2 27712 1727096514.43120: Calling all_plugins_inventory to load vars for managed_node2 27712 1727096514.43125: Calling all_plugins_play to load vars for managed_node2 27712 1727096514.43127: Calling groups_plugins_inventory to load vars for managed_node2 27712 1727096514.43130: Calling groups_plugins_play to load vars for managed_node2 27712 1727096514.44304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 27712 1727096514.45992: done with get_vars() 27712 1727096514.46021: done queuing things up, now waiting for results queue to drain 27712 1727096514.46024: results queue empty 27712 1727096514.46024: checking for any_errors_fatal 27712 1727096514.46026: done checking for any_errors_fatal 27712 1727096514.46026: checking for max_fail_percentage 27712 1727096514.46027: done checking for max_fail_percentage 27712 1727096514.46028: checking to see if all hosts have failed and the running result is not ok 27712 1727096514.46028: done checking to see if all hosts have failed 27712 1727096514.46029: getting the remaining hosts for this loop 27712 1727096514.46030: done getting the remaining hosts for this loop 27712 1727096514.46033: getting the next task for host managed_node2 27712 1727096514.46037: done getting next task for host managed_node2 27712 1727096514.46038: ^ task is: None 27712 1727096514.46039: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 27712 1727096514.46041: done queuing things up, now waiting for results queue to drain 27712 1727096514.46041: results queue empty 27712 1727096514.46042: checking for any_errors_fatal 27712 1727096514.46043: done checking for any_errors_fatal 27712 1727096514.46043: checking for max_fail_percentage 27712 1727096514.46044: done checking for max_fail_percentage 27712 1727096514.46045: checking to see if all hosts have failed and the running result is not ok 27712 1727096514.46046: done checking to see if all hosts have failed 27712 1727096514.46048: getting the next task for host managed_node2 27712 1727096514.46050: done getting next task for host managed_node2 27712 1727096514.46051: ^ task is: None 27712 1727096514.46052: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2 : ok=107 changed=3 unreachable=0 failed=0 skipped=88 rescued=0 ignored=2

Monday 23 September 2024 09:01:54 -0400 (0:00:00.798) 0:00:40.156 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.89s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.86s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.62s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.25s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.17s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3
Create veth interface ethtest0 ------------------------------------------ 1.17s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.06s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface ethtest1 ------------------------------------------ 1.02s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 0.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 0.93s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.81s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install iproute --------------------------------------------------------- 0.80s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Verify DNS and network connectivity ------------------------------------- 0.80s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.77s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install iproute --------------------------------------------------------- 0.72s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.71s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.66s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather current interface info ------------------------------------------- 0.66s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
27712 1727096514.46395: RUNNING CLEANUP
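
For reference, the "Check routes and DNS" diagnostic whose result appears near the top of this excerpt runs a similar shell script. Below is a minimal sketch of such a task, reconstructed from the cmd string in that result; the surrounding YAML, its file location, and any registered variables are assumptions rather than text taken from the test repository:

    # Sketch only: reconstructed from the cmd shown in the "Check routes and DNS" result above.
    - name: Check routes and DNS
      shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi
      # Purely diagnostic: the script only prints addressing, routing, and
      # resolver state; the log does not show a register or failed_when, so
      # none are added here.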